this post was submitted on 25 Jul 2023
188 points (95.6% liked)
Asklemmy
Consciousness is not based on memory or else computers would be considered conscious.
And if it worked the way you're saying, a clone with all of your memories would mean you have two points of view. I could take your clone into a different room and you'd be able to tell me what they see. But it obviously wouldn't work like that, because your own sense of self would still be locked in your head and the clone would get its own sense of self, albeit one with the same memories.
What I meant is that memory plays a key role.
Consciousness is, simplified, a set of self-feeding loops over input and memory, with emotions and attention (the amygdala) as regulatory mechanisms.
And what we consider consciousness only exists because of short-term memory and our vast mental capabilities. Arguably, every higher animal has a sort of consciousness, just a far more limited one, and maybe a more limited set of regulators (memories), because of our social nature.
No, the input is not shared between two beings, even if there are two of the same.
Exactly. But because he has the same body, the same memories, and the same feelings, he is you. That would change with time if the original you is not deconstructed, because the "you" of today is not the "you" of yesterday, thanks to memories, gene expression, yadda yadda.
There is no reason why what you describe should give rise to consciousness rather than just a biological artificial intelligence. The sense of self, the perspective that feels like me peering out through my eyes, is not explained by anything you said.
A copy of me does not equal me because we'd both have separate senses of self. Having copied memories does nothing to affect that.