TinyTimmyTokyo

joined 1 year ago
[–] TinyTimmyTokyo 16 points 5 months ago

Until a month ago, TW was the long-time researcher for "Blocked and Reported", the podcast hosted by Katie 'TERF' Herzog and relentless sealion Jesse Singal.

[–] TinyTimmyTokyo -1 points 7 months ago (7 children)

I wouldn't know anything about the thread, as it's impossible for me to read without a twitter account. Yet another reason why the site is trash.

But by all means, go on generating content for a bigoted fascist who will use everything you write to increase engagement with his platform and give it undeserved credibility and $$$.

(And no, I won't block you.)

[–] TinyTimmyTokyo 4 points 7 months ago (9 children)

Sorry for the off-topic rant, but WTF is Emile Torres doing on twitter? Anytime I see someone creating content for that Nazi hellsite, I start looking at them differently.

[–] TinyTimmyTokyo 5 points 7 months ago

I'd really like to know the back story on this interview too. I realize weirdness isn't exactly distinctive when it comes to rationalists, but Zack is in a league of his own.

[–] TinyTimmyTokyo 5 points 8 months ago (1 child)

I've been using DigitalOcean for years as a personal VPS box, and I've had no complaints. Not sure how well they'd scale (in terms of cost) for a site like this.

[–] TinyTimmyTokyo 10 points 8 months ago (5 children)

Anthropic's Claude confidently and incorrectly diagnoses brain cancer based on an MRI.

[–] TinyTimmyTokyo 12 points 8 months ago (6 children)

Strange man posts strange thing.

[–] TinyTimmyTokyo 13 points 8 months ago (1 child)

This linked interview of Brian Merchant by Adam Conover is great. I highly recommend watching the whole thing.

For example, here is Adam, describing the actual reasons why striking writers were concerned about AI, followed by Brian explaining how Sam Altman et al. hype up the existential risk they themselves claim to be creating, just so they can sell themselves as the solution. Lots of really edifying stuff in this interview.

[–] TinyTimmyTokyo 9 points 8 months ago (1 child)

She really is insufferable. If you've ever listened to her Pivot podcast (do not advise), you'll be confronted with the superficiality and banality of her hot takes. Of course, this assumes you're able to penetrate the word salad she regularly uses to convey any point she's trying to make. She is not a good verbal communicator.

Her co-host, "Professor" [*] Scott Galloway, isn't much better. While more verbally articulate, his dick joke-laden takes are often even more insufferable than Swisher's. I'm pretty sure Kara sourced from him her opinion that you should "use AI or be run over by progress"; it's one of his most frequent hot takes. He's also one of the biggest tech hype maniacs, so of course he's bought a ticket on the AI hype express. Before the latest AI boom, he was a crypto booster, although he's totally memory-holed that phase of his life now that the crypto hype train has run off a cliff.

[*] I put professor in quotes, because he's one of those people who insist on using a title that is equal parts misleading and pretentious. He doesn't have a doctorate in anything, and while he's technically employed by NYU's business school, he's a non-tenured "clinical professor", which is pretty much the same as an adjunct. Nothing against adjunct professors, but most adjuncts I've known don't go around insisting that you call them "professor" in every social interaction. It's kind of like when Ph.D.s insist you call them "doctor".

[–] TinyTimmyTokyo 10 points 8 months ago

I wonder what percentage of fraudulent AI-generated papers would be discovered simply by searching for sentences that begin with "Certainly, ..."
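A naive first pass really is just a grep. Here's a minimal Python sketch of the idea (the directory name and the phrase list are made up, and a real screen would need smarter sentence splitting):

    import re
    from pathlib import Path

    # Telltale chatbot-style openers to flag (illustrative, not exhaustive).
    PHRASES = ("Certainly,", "As an AI language model", "I hope this helps")

    def flag_suspect_sentences(paper_dir):
        # Scan each plain-text paper for sentences starting with a telltale phrase.
        for path in Path(paper_dir).glob("*.txt"):
            text = path.read_text(errors="ignore")
            # Crude sentence split on terminal punctuation; fine for a first pass.
            for sentence in re.split(r"(?<=[.!?])\s+", text):
                if sentence.startswith(PHRASES):
                    print(f"{path.name}: {sentence[:80]}")

    flag_suspect_sentences("papers")  # hypothetical directory of extracted paper texts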

[–] TinyTimmyTokyo 10 points 8 months ago (1 child)

I'm probably not saying anything you didn't already know, but Vox's "Future Perfect" section, of which this article is a part, was explicitly founded as a booster for effective altruism. They've also memory-holed the fact that it was funded in large part by FTX. Anything by one of its regular writers (particularly Dylan Matthews or Kelsey Piper) should be mentally filed into the rationalist propaganda folder. I mean, this article throws in an off-hand remark by Scott Alexander as if it's just taken for granted that he's some kind of visionary genius.

 

[All non-sneerclub links below are archive.today links]

Diego Caleiro, who popped up on my radar after he commiserated with Roko's latest in a never-ending stream of denials that he's a sex pest, is worthy of a few sneers.

For example, he thinks Yud is the bestest, most awesomest, coolest person to ever breathe:

Yudkwosky is a genius and one of the best people in history. Not only he tried to save us by writing things unimaginably ahead of their time like LOGI. But he kind of invented Lesswrong. Wrote the sequences to train all of us mere mortals with 140-160IQs to think better. Then, not satisfied, he wrote Harry Potter and the Methods of Rationality to get the new generation to come play. And he founded the Singularity Institute, which became Miri. It is no overstatement that if we had pulled this off Eliezer could have been THE most important person in the history of the universe.

As you can see, he's really into superlatives. And Jordan Peterson:

Jordan is an intellectual titan who explores personality development and mythology using an evolutionary and neuroscientific lenses. He sifted through all the mythical and religious narratives, as well as the continental psychoanalysis and developmental psychology so you and I don’t have to.

At Burning Man, he dons a 7-year-old alter ego named "Evergreen". Perhaps he has an infantilization fetish like Elon Musk:

Evergreen exists ephemerally during Burning Man. He is 7 days old and still in a very exploratory stage of life.

As he hinted in his tweet to Roko, he has an enlightened view about women and gender:

Men were once useful to protect women and children from strangers, and to bring home the bacon. Now the supermarket brings the bacon, and women can make enough money to raise kids, which again, they like more in the early years. So men have become useless.

And:

That leaves us with, you guessed, a metric ton of men who are no longer in families.

Yep, I guessed about 12 men.

 

Excerpt:

Richard Hanania, a visiting scholar at the University of Texas, used the pen name “Richard Hoste” in the early 2010s to write articles where he identified himself as a “race realist.” He expressed support for eugenics and the forced sterilization of “low IQ” people, who he argued were most often Black. He opposed “miscegenation” and “race-mixing.” And once, while arguing that Black people cannot govern themselves, he cited the neo-Nazi author of “The Turner Diaries,” the infamous novel that celebrates a future race war.

He's also a big eugenics supporter:

“There doesn’t seem to be a way to deal with low IQ breeding that doesn’t include coercion,” he wrote in a 2010 article for AlternativeRight.com. “Perhaps charities could be formed which paid those in the 70-85 range to be sterilized, but what to do with those below 70 who legally can’t even give consent and have a higher birthrate than the general population? In the same way we lock up criminals and the mentally ill in the interests of society at large, one could argue that we could on the exact same principle sterilize those who are bound to harm future generations through giving birth.”

(Reminds me a lot of the things Scott Siskind has written in the past.)

Some people who have been friendly with Hanania:

  • Marc Andreessen, Silicon Valley VC and co-founder of Andreessen Horowitz
  • Hamish McKenzie, CEO of Substack
  • Elon Musk, Chief Enshittification Officer of Tesla and Twitter
  • Tyler Cowen, libertarian econ blogger and George Mason University prof
  • J.D. Vance, US Senator from Ohio
  • Steve Sailer, race (pseudo)science promoter and all-around bigot
  • Amy Wax, racist law professor at UPenn
  • Christopher Rufo, right-wing agitator and architect of many of Florida governor Ron DeSantis's culture war efforts
 

Ugh.

But even if some of Yudkowsky’s allies don’t entirely buy his regular predictions of AI doom, they argue his motives are altruistic and that for all his hyperbole, he’s worth hearing out.
