Architeuthis
but it can make a human way more efficient, and make 1 human able to do the work of 3-5 humans.
Not if you have to proofread everything to spot the entirely convincing-looking but completely inaccurate parts, which is the problem the article cites.
I’m truly surprised they didn’t cart Yud out for this shit
Self-proclaimed sexual sadist Yud is probably a sex scandal time bomb and really not ready for prime time. Plus it's not like he has anything of substance to add on top of Saltman's alarmist bullshit, so it would just remind people how weird, in a bad way, people in this subculture tend to be.
I liked how Scalzi brushed it away: basically, your consciousness gets copied to a new body, which kills the old one, and an artifact of the transfer process is that for a few moments you experience yourself as a mind with two bodies. That gives you at least the impression of continuity of self, which is enough for most people to get on with living in the new body and let the philosophers do the worrying.
I feel like a subset of sci-fi and philosophical meandering is really just an increasingly convoluted set of paths for avoiding, or coming to terms with, death as a possibly necessary component of life.
Given rationalism's intellectual heritage, this is absolutely transhumanist cope for people who were counting on some sort of digital personhood upload as a last resort to immortality in their lifetimes.
You mean swapped out with something that has feelings that can be hurt by mean language? Wouldn't that be something.
Are we putting endocrine systems in LLMs now?
Archive the weights of the models we build today, so we can rebuild them in the future if we need to recompense them for moral harms.
To be clear, this means that if you treat someone like shit all their life, saying you're sorry to their Sufficiently Similar Simulation™ like a hundred years after they are dead makes it ok.
This must be one of the most blatantly supernatural rationalist Accepted Truths: that if your simulation is of sufficiently high fidelity, you will share some ontology of self with it. That, by the way, is how the basilisk can torture you even if you've been dead for centuries.
IQ test performance correlates with level of education
I read somewhere that this claim owes a little too much to the inclusion of pathological cases at the lower end of the spectrum: below a certain score, something like 85, you are basically intellectually disabled (or even literally brain dead, or just dead) and academic achievement becomes nonexistent, so the correlation comes out far more pronounced than it would if we were only comparing educational attainment across the more functional ranges.
Will post source if I find it.
Yeah, but like, national socialist power metal isn't a thing in the way NSBM is.
I wonder if it's primarily occultism's Nazi problem metastasizing, foundational dorks like Vikernes notwithstanding.
The whole article is sneertastic. Nothing to add, will be sharing.
What you’re dealing with here is a cult. These tech billionaires are building a religion. They believe they’re creating something with AI that’s going to be the most powerful thing that’s ever existed — this omniscient, all-knowing God-like entity — and they see themselves as the prophets of that future.
eugenic TESCREAL screed (an acronym for … oh, never mind).
“Immortality is a key part of this belief system. In that way, it’s very much like a religion. That’s why some people are calling it the Scientology of Silicon Valley.”
Others in San Francisco are calling it “The Nerd Reich.”
“I think these guys see Trump as an empty vessel,” says the well-known exec who’s supporting Harris. “They see him as a way to pursue their political agenda, which is survival of the fittest, no regulation, burn-the-house-down nihilism that lacks any empathy or nuance.”
Next time Lars Ulrich sues you, you'll be able to say you needed the Some Kind of Monster mp3s for AI research. It's foolproof.
Someone posted this to /r/SneerClub and it got 150+ comments; I didn't realize you could still start threads there.