this post was submitted on 09 Jan 2024
SneerClub
Hurling ordure at the TREACLES, especially those closely related to LessWrong.
AI-Industrial-Complex grift is fine as long as it sufficiently relates to the AI doom from the TREACLES. (Though TechTakes may be more suitable.)
This is sneer club, not debate club. Unless it's amusing debate.
[Especially don't debate the race scientists, if any sneak in - we ban and delete them as unsuitable for the server.]
you are viewing a single comment's thread
Perhaps. The problem of human flight was “solved” by the development of large, unwieldy machines driven by (relatively speaking, cf. pigeons) highly inefficient propulsion systems which are very good at covering long distances, oceans, and rough terrain quickly - the aim was Daedalus and Icarus, but aerospace companies are fortunate that the flying machine turned out to have advantages in strictly commercial and military use. It’s completely undecided physically whether there is a solution to the problem of building human-like intelligence which does a comparable job to having sex, even with complete information about the workings of humans.
Yes, and ultimately this question, of what gets built as opposed to what is knowable, is an economics question. The energy gradients available to a bird are qualitatively different from those available to industry, or to individual humans. Of course they are!
There's no theoretical limit to how close a universal function approximator can get to a closed-system definition of something. A bird's flight isn't magic, or unknowable, or non-reproducible. If it were, we'd have no sense of awe in learning about it and studying it. Imagine if human-like intelligent behavior were completely unknowable. How would we go about teaching anything? Communicating at all? Sharing our experiences?
But in the end, it's not just the knowledge of a thing that matters. It's the whole economics of that thing embedded in its environment.
I guess I violently agree with the observation, but I also take care not to put humanity, or intelligence in a broad sense, in some special magical untouchable place either. I feel it can be just as reductionist in the end to insist there is no solution as to say that any solution has its trade-offs and costs.
While I agree with you about the economics, I’m trying to point out that physical reality also has constraints other than economic, many of them unknown, some of them discovered in the process of development.
No. But it is unreproducible if you already have arms with shoulders, elbows, hands, and five stubby fingers. Human and bird bodies are sufficiently different that there is no close approximation which will give humans flight as it is found in birds.
To me, this is a series of non sequiturs. It's obvious that you can have awe for something without having a genuine understanding of it, but that's beside the point. Similarly, the kind of knowledge required for humans to communicate with one another isn't relevant - what we want to know is the kind of knowledge which goes into the physical task of making artificial humans. And you ride roughshod over one of the most interesting aspects of the human experience: human communication and mutual understanding are possible across vast gulfs of the unknown, which is itself rather beautiful.
But again I can’t work out what makes that particularly relevant. I think there’s a clue here though:
Right, but this would be a common (and mistaken) move some people make which I’m not making, and which I have no desire to make. You’re replying here to people who affirm either an implicit or explicit dualism about human consciousness, and say that the answers to some questions are just out of reach forever. I’m not one of those people, and I’m referring specifically to the words I used to make the point that I made, namely that there exist real physical constraints repeatedly approached and arrived at in the history of technology which demonstrate that not every problem has an ideal solution (and I refer you back to my earlier point about aircraft to show how that cashes out in practice).
For what it's worth then, I don't think we're in disagreement, so I just want to clarify a couple of things.
When I say open-system economics, I mean from an ecological point of view, not just the pay-dollars-for-product point of view. Strictly speaking, there is some theoretical price and a process, however gruesome, that could force a human into the embodiment of a bird. But from an ecosystems point of view, it raises the obvious question: why? Maybe there is an answer to why that would happen, but it's not a question of knowledge of a thing, or even the process of doing it; it's the economic question as a whole.
The same thing applies to human intelligence, however we plan to define it. Nature is already full of systems that have memory, that can abstract and reason, that can use tools, that are social, that are robust in the face of novel environments. We are unique, but not due to any particular capability; we're unique because of the economics and our relationship with all the other things we depend upon. I think that's awesome!
I only made my comment as a caution though, because yes, I do think that overall people still put humanity and our intelligence on a pedestal, and I think that plays into rationalist hands. I love being human and the human experience. I also love being alive, and part of nature, and the experience of the ecosystem as a whole. From that perspective, it would be hard for me to believe that any particular part of human intelligence can't be reproduced with technology, because to me it's already abundant in nature. The question for me, and for our ecosystem at large, is: when it does occur, what's the cost? What role will it have? What regulations does it warrant? What other behaviors will it exhibit? And also, I'm OK not being in control of those answers. I can just live with a certain degree of uncertainty.