this post was submitted on 05 Dec 2023
15 points (100.0% liked)

SneerClub


Hurling ordure at the TREACLES, especially those closely related to LessWrong.

AI-Industrial-Complex grift is fine as long as it sufficiently relates to the AI doom from the TREACLES. (Though TechTakes may be more suitable.)

This is sneer club, not debate club. Unless it's amusing debate.

[Especially don't debate the race scientists, if any sneak in - we ban and delete them as unsuitable for the server.]


I was wondering if someone here has a better idea of how EA developed in its early days than I do.

Judging by the link I posted, it seems like Yudkowsky used the term "effective altruist" years before Will MacAskill or Peter Singer adopted it. The link doesn't mention this explicitly, but Will MacAskill was also a LessWrong user, so it seems at least plausible that Yudkowsky is the true father of the movement.

I want to sort this out because I've noticed that a lot of EAs have recently been downplaying the AI and longtermist elements within the movement and talking more about Peter Singer as the movement's founder. By contrast, my impression is that EA started with Yudkowsky and then MacAskill, with Peter Singer only getting involved later. Is my impression mistaken?

top 12 comments
[–] [email protected] 13 points 11 months ago (3 children)

Eh, the impression I get here is that Eliezer happened to put "effective" and "altruist" together without intending to coin a new term. This is Yud we're talking about - he's written roughly 500,000 more words about Harry Potter than the average person writes in their lifetime.

Even if he had invented the term, I wouldn't say this is a smoking gun of how intertwined EAs are with the LW rats - there's much better evidence out there.

[–] [email protected] 9 points 11 months ago

Oh my god. I should have realised that of course teenage Yud would be Like This, but looking through this archive is a trip.

[–] [email protected] 7 points 11 months ago

He has quite possibly written more words about Harry Potter than She Who Shall Not Be Named, herself.

[–] [email protected] 7 points 11 months ago

Thank you, that link is exactly what I was looking for (and it also sated my curiosity about how Yudkowsky got involved with Bostrom and Hanson; I had heard they met on the Extropians listserv but had never seen any proof).

[–] [email protected] 12 points 11 months ago* (last edited 11 months ago)

EA as a movement was a combination of a few different groups (this account says Giving What We Can / 80,000 Hours, GiveWell, and Yudkowsky's MIRI). However, the main early influx of people came from the rationalist movement, as Yud had heavily promoted EA-style ideas in the Sequences.

So if you look at surveys, right now a relatively small percentage (like 15%) of EAs first heard about it through LessWrong or SSC. But back in 2014 and earlier, LessWrong was the number one on-ramp into the movement (like 30%). (I'm sure a bunch of the people who gave other answers heard about it from rationalist friends as well.) I think the share would have been even higher if you go back earlier.

Nowadays, most of the recruiting is independent of the rationalists, so you have a bunch of people coming in and being like, what's with all the weird shit? However, they still adopt a ton of rationalist ideas and language, and the EA Forum is run by the same people as LessWrong. It leads to some tension: someone wrote a post saying that "Yudkowsky is frequently, confidently, egregiously wrong", and it was somewhat upvoted on the EA Forum but massively downvoted on LessWrong.

[–] [email protected] 11 points 11 months ago

downplaying the AI and longtermist elements within the movement

and this has always been an issue - they're a fucking embarrassment, but they also do a lot of the organisational work so it's hard to get rid of them

[–] [email protected] 11 points 11 months ago

yeah, he totally did. EAs claiming otherwise are just incorrect.

remember that the original Roko's Basilisk post talked about the dilemma of being an "altruist", i.e. how to donate as much money as possible to MIRI (or SIAI as it was in 2010).

the terms were in extremely heavy use back then.

a bunch of the various Singer-inspired groups tried to work out a name for the whole thing, and they picked Yudkowsky's coinage.

[–] [email protected] 8 points 11 months ago

From what I remember, EA has been popularized by rationalist culture for like two decades now.

[–] [email protected] 8 points 11 months ago

This is gonna be really helpful next time someone tells me straight up that EA and rationalism are totally different things that just overlap by coincidence.

[–] [email protected] 6 points 11 months ago (2 children)

Speaking of Big Yud, have there been any new developments with him since the fallout from his Time Magazine article in March? Not like there's been any international coalition to first-strike rogue data centers or whatever. Has everyone just kind of ignored him?

[–] [email protected] 5 points 11 months ago

He's been doing interviews on podcasts. The NYT also recently listed "internet philosopher" Eliezer Yudkowsky as one of the key figures of the modern artificial intelligence movement.

Sadly he did not wear a fedora in his official NYT picture

[–] [email protected] 4 points 11 months ago

Well, you need motive and the ability to follow up on your threats(*) for it to be a real threat, and he certainly lacks the latter. We also have a tendency to ignore people who have the former but lack the latter until it is too late. (Some of my recent Dutch elections doomerism might be influencing this message.)

*: if you are privileged in society.