this post was submitted on 02 Dec 2023
30 points (100.0% liked)

SneerClub


Hurling ordure at the TREACLES, especially those closely related to LessWrong.

AI-Industrial-Complex grift is fine as long as it sufficiently relates to the AI doom from the TREACLES. (Though TechTakes may be more suitable.)

This is sneer club, not debate club. Unless it's amusing debate.

[Especially don't debate the race scientists, if any sneak in - we ban and delete them as unsuitable for the server.]

At various points, on Twitter, Jezos has defined effective accelerationism as “a memetic optimism virus,” “a meta-religion,” “a hypercognitive biohack,” “a form of spirituality,” and “not a cult.” ...

When he’s not tweeting about e/acc, Verdon runs Extropic, which he started in 2022. Some of his startup capital came from a side NFT business, which he started while still working at Google’s moonshot lab X. The project began as an April Fools joke, but when it started making real money, he kept going: “It's like it was meta-ironic and then became post-ironic.” ...

On Twitter, Jezos described the company as an “AI Manhattan Project” and once quipped, “If you knew what I was building, you’d try to ban it.”

[–] [email protected] 16 points 1 year ago (1 children)

In its reaction against both EA and AI safety advocates, e/acc also explicitly pays tribute to another longtime Silicon Valley idea. “This is very traditional libertarian right-wing hostility to regulation,” said Benjamin Noys, a professor of critical theory at the University of Chichester and scholar of accelerationism. Jezos calls it the “libertarian e/acc path.”

At least the Italian futurists were up front about their agenda.

“We’re trying to solve culture by engineering,” Verdon said. “When you're an entrepreneur, you engineer ways to incentivize certain behaviors via gradients and reward, and you can program a civilizational system.”

Reading Nudge to engineer the 'Volksschädling' to board the trains voluntarily. Dusting off the old state eugenics compensation programs.

[–] [email protected] 16 points 1 year ago (2 children)

The fuck do they mean "solve culture"? Is culture a problem to be solved? Actually don't answer that.

[–] [email protected] 10 points 1 year ago (2 children)

even more horrifying — they see culture as a system of equations they can use AI to generate solutions for, and the correct set of solutions will give them absolute control over culture. they apply this to all aspects of society. these assholes didn’t understand hitchhiker’s guide to the galaxy or any of the other sci fi they cribbed these ideas from, and it shows

[–] [email protected] 10 points 1 year ago (1 children)

It's like pickup artistry on a societal scale.

It really does illustrate the way they see culture not as, like, a beautiful evolving dynamic system that makes life worth living, but instead as a stupid game to be won or a nuisance getting in the way of their world domination efforts

[–] [email protected] 8 points 1 year ago (1 children)

remember that Yudkowsky's CEV idea was literally to analytically solve ethics

[–] [email protected] 8 points 1 year ago (2 children)

In an essay that somehow manages to offhandedly mention both evolutionary psychology and hentai anime in the same paragraph.

[–] [email protected] 7 points 1 year ago* (last edited 1 year ago) (1 children)

It's like when he wore a fedora and started talking about 4chan greentexts in his first major interview. He just cannot help himself.

P.S. The New York Times recently listed "internet philosopher" Eliezer Yudkowsky as one of the major figures in the modern AI movement; this is the picture they chose to use.

[–] [email protected] 9 points 1 year ago (1 children)

this is the picture they chose to use.

You may not like it, but this is what peak rationality looks like.

[–] [email protected] 7 points 1 year ago (1 children)

the cookie cutter glasses are load bearing

[–] [email protected] 5 points 1 year ago

That hat could only work with New Year's Eve glasses from a year with "00".

[–] [email protected] 9 points 1 year ago (1 children)

The ultimate STEMlord misunderstanding of culture; something absolutely rife in the Silicon Valley tech-sphere.

[–] [email protected] 10 points 1 year ago (1 children)

These dudes wouldn't recognize culture if it unsafed its Browning and shot them in the kneecaps.

[–] [email protected] 5 points 1 year ago (1 children)

Now I'm wondering, does the Ma Deuce have a safety? This is important information if I ever have to defend my atoll from the e/acc smoker attack.

[–] [email protected] 5 points 1 year ago

Apparently not; some soldiers appear to have bodged their own safeties by doing things like jamming an expended case underneath the trigger.

[–] [email protected] 9 points 1 year ago (1 children)

Don’t have to have Culture War when you can just systemically deploy the exact culture you want right from the comfort of your prompt, amirite?!

(This is a shitpost idea but it’s probably halfway accurate, maybe modulo the prompt (but there will definitely be someone also trying that))

[–] [email protected] 8 points 1 year ago (1 children)
[–] [email protected] 8 points 1 year ago* (last edited 1 year ago)

The best use case for Urbit is marking its proponents as first up against the wall when the revolution comes.