(smashes imaginary intercom button) "Who is this 'some guy'? Find him and find out what he knows!!"
Happy belated birthday!
Elon Musk in the replies:
Have you read Asimov’s Foundation books?
They pose an interesting question: if you knew a dark age was coming, what actions would you take to preserve knowledge and minimize the length of the dark age?
For humanity, a city on Mars. Terminus.
Isaac Asimov:
I'm a New Deal Democrat who believes in soaking the rich, even when I'm the rich.
(From a 1968 letter quoted in Yours, Isaac Asimov.)
Lex Fridman: "I'm going to do a deep dive on Ancient Rome. Turns out it was a land of contrasts"
I'm doing a podcast episode on the Roman Empire.
It's a deep dive into military conquest, technology, politics, economics, religion... from its rise to its collapse (in the west & the east).
History really does put everything in perspective.
(xcancel)
... "Coming of Age" also, oddly, describes another form of novel cognitive dissonance; encountering people who did not think Eliezer was the most intelligent person they had ever met, and then, more shocking yet, personally encountering people who seemed possibly more intelligent than himself.
The latter link is to "Competent Elites", a.k.a., "Yud fails to recognize that cocaine is a helluva drug".
I've met Jurvetson a few times. After the first I texted a friend: “Every other time I’ve met a VC I walked away thinking ‘Wow, I and all my friends are smarter than you.’ This time it was ‘Wow, you are smarter than me and all my friends.’”
Uh-huh.
Quick, to the Bat-Wikipedia:
On November 13, 2017, Jurvetson stepped down from his role at DFJ Venture Capital in addition to taking leave from the boards of SpaceX and Tesla following an internal DFJ investigation into allegations of sexual harassment.
Not smart enough to keep his dick in his pants, apparently.
Then, from 2006 to 2009, in what can be interpreted as an attempt to discover how his younger self made such a terrible mistake, and to avoid doing so again, Eliezer writes the 600,000 words of his Sequences, by blogging “almost daily, on the subjects of epistemology, language, cognitive biases, decision-making, quantum mechanics, metaethics, and artificial intelligence”
Or, in short, cult shit.
Between his Sequences and his Harry Potter fanfic, come 2015, Eliezer had promulgated his personal framework of rational thought — which was, as he put it, “about forming true beliefs and making decisions that help you win” — with extraordinary success. All the pieces seemed in place to foster a cohort of bright people who would overcome their unconscious biases, adjust their mindsets to consistently distinguish truth from falseness, and become effective thinkers who could build a better world ... and maybe save it from the scourge of runaway AI.
Which is why what happened next, explored in tomorrow’s chapter — the demons, the cults, the hells, the suicides — was, and is, so shocking.
Or not. See above, RE: cult shit.
Something tells me they’re not just slapping ChatGPT on the school computers and telling kids to go at it; surely one of the parents would have been up-to-date enough to know it’s a scam otherwise.
If people with money had that much good sense, the world would be a well-nigh unfathomably different place....
I actually don’t get the general hate for AI here.
Try harder.
We have had readily available video communication for over a decade.
We've been using "video communication" to teach for half a century at least; Open University enrolled students in 1970. All the advantages of editing together the best performances from a top-notch professor, moving beyond the blackboard to animation, etc., etc., were obvious in the 1980s when Caltech did exactly that and made a whole TV series to teach physics students and, even more importantly, their teachers. Adding a new technology that spouts bullshit without regard to factual accuracy is necessarily, inevitably, a backward step.
AI can directly and individually address that frustration and find a solution.
No, it can't.
Another thing I turned up and that I need to post here so I can close that browser tab and expunge the stain from my being: Yud's advice about awesome characters.
I find that fiction writing in general is easier for me when the characters I’m working with are awesome.
The important thing for any writer is to never challenge oneself. The Path of Least Resistance(TM)!
The most important lesson I learned from reading Shinji and Warhammer 40K
What is the superlative of "read a second book"?
Awesome characters are just more fun to write about, more fun to read, and you’re rarely at a loss to figure out how they can react in a story-suitable way to any situation you throw at them.
"My imagination has not yet descended."
Let’s say the cognitive skill you intend to convey to your readers (you’re going to put the readers through vicarious experiences that make them stronger, right? no? why are you bothering to write?)
In college, I wrote a sonnet to a young woman in the afternoon and joined her in a threesome that night.
You’ve set yourself up to start with a weaksauce non-awesome character. Your premise requires that she be weak, and break down and cry.
“Can’t I show her developing into someone who isn’t weak?” No, because I stopped reading on the first page. You haven’t given me anyone I want to sympathize with, and unless I have some special reason to trust you, I don’t know she’s going to be awesome later.
Holding fast through the pain induced by the rank superficiality, we might just find a lesson here. Many fans of Harry Potter have had to cope, in their own personal ways, with the stories aging badly or becoming difficult to enjoy. But nothing that Rowling does can perturb Yudkowsky, because he held the stories in contempt all along.
I have to admit that I wasn't expecting LinkedIn to become a wretched hive of "quantum" bullshit, but hey, here we are.
Tangentially: Schrödinger is a one-man argument for not naming ideas after people.