this post was submitted on 30 Nov 2024
31 points (100.0% liked)

SneerClub

1010 readers

Hurling ordure at the TREACLES, especially those closely related to LessWrong.

AI-Industrial-Complex grift is fine as long as it sufficiently relates to the AI doom from the TREACLES. (Though TechTakes may be more suitable.)

This is sneer club, not debate club. Unless it's amusing debate.

[Especially don't debate the race scientists, if any sneak in - we ban and delete them as unsuitable for the server.]

founded 2 years ago

https://nonesense.substack.com/p/lesswrong-house-style

Given that they are imbeciles given, occasionally, to dangerous ideas, I think it’s worth taking a moment now and then to beat them up. This is another such moment.

[–] captainlezbian@lemmy.world 7 points 2 weeks ago (15 children)

That sounds like a religion insisting it isn’t one

[–] AnarchistArtificer@slrpnk.net 6 points 2 weeks ago* (last edited 2 weeks ago) (8 children)

They do seem to worship Bayes

Edit: I want to qualify that I'm a big fan of Bayes' theorem — in my field, there's some awesome stuff being done with Bayesian models that would be impossible to do with frequentist statistics. Any scorn in my comment is directed at the religious fervour that LW directs at Bayesian statistics, not at the stats themselves.

I say this to emphasise that LWers aren't cringe for being super enthusiastic about maths. It's the everything else that makes them cringe.

[–] Amoeba_Girl@awful.systems 11 points 2 weeks ago* (last edited 2 weeks ago) (7 children)

The particular way they invoke Bayes' theorem is fascinating. They don't seem to ever actually use it in any sort of rigorous way, it's merely used as a way to codify their own biases. It's an alibi for putting a precise percentage point on your vibes. It's kind of beautiful in a really stupid sort of way.

[–] blakestacey@awful.systems 11 points 2 weeks ago (1 children)

They take a theory that is supposed to be about updating one's beliefs in the face of new evidence, and they use it as an excuse to never change what they think.

It's the Bayesian version of Zeno's paradox. Before one can update one's beliefs, one must have evidence for an alternative proposition. But no single piece of evidence is worth meaningfully changing your worldview and actions; to be that, it would need supporting evidence. But then that supporting evidence would itself need to be supported, and so on ad infinitum.
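For contrast, an actual Bayesian update is a one-line computation, not a vibe check. A minimal sketch in Python — the prior and likelihood numbers here are made up purely for illustration:

```python
def bayes_update(prior, p_e_given_h, p_e_given_not_h):
    """Posterior P(H|E) via Bayes' theorem:
    P(H|E) = P(E|H)·P(H) / (P(E|H)·P(H) + P(E|¬H)·P(¬H))."""
    numerator = p_e_given_h * prior
    evidence = numerator + p_e_given_not_h * (1 - prior)
    return numerator / evidence

# Made-up numbers: prior belief 0.5, evidence twice as likely if H is true.
posterior = bayes_update(0.5, 0.8, 0.4)
print(posterior)  # ≈ 0.667 — the belief actually moves
```

The point of the theorem is exactly the part the commenters describe LWers skipping: feed in evidence, and the number is forced to change.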
