this post was submitted on 19 Dec 2024
-1 points (44.4% liked)

Ask Science

8753 readers

Ask a science question, get a science answer.


Community Rules

Rule 1: Be respectful and inclusive. Treat others with respect and maintain a positive atmosphere.

Rule 2: No harassment, hate speech, bigotry, or trolling. Avoid any form of harassment, hate speech, bigotry, or offensive behavior.

Rule 3: Engage in constructive discussions. Contribute to meaningful discussions that enhance scientific understanding.

Rule 4: No AI-generated answers. Providing answers generated by AI systems is not allowed and may result in a ban.

Rule 5: Follow guidelines and moderators' instructions. Adhere to community guidelines and comply with instructions given by moderators.

Rule 6: Use appropriate language and tone. Communicate using suitable language and maintain a professional, respectful tone.

Rule 7: Report violations. Report any violations of the community rules to the moderators for appropriate action.

Rule 8: Foster a continuous learning environment. Encourage an environment where members can share knowledge and engage in scientific discussions.

Rule 9: Source required for answers. Provide credible sources for answers; answers without a source may be removed to ensure information reliability.

By adhering to these rules, we create a welcoming and informative environment where science-related questions receive accurate and credible answers. Thank you for helping make the Ask Science community a valuable resource for scientific knowledge.

We retain the discretion to modify the rules as we deem necessary.


founded 2 years ago
top 5 comments
[–] savvywolf@pawb.social 2 points 7 hours ago

"Hate" and "Love" are complex things that are very ingrained into human nature. For an AI or robot to have these things, it would essentially have to emulate or implement a human mind. Such a thing is currently very far beyond our current technology level, and arguably isn't even a goal of many AI projects. Most AI systems like ChatGPT are basically glorified autocomplete. They are given an input and use complex maths and probabilities to predict what a human would respond to it. They don't have any understanding about what they are talking about.

I think if an AI were able to hate or love, it would raise complicated and perhaps uncomfortable questions about what it means to be "human". Can a system that perfectly replicates human emotions and experiences not be considered human itself?

[–] DirigibleProtein@aussie.zone 0 points 4 hours ago


I'm not sure we understand "hate" and "love" in biological brains well enough to replicate those emotions with transistors and be confident that we had succeeded.

[–] Bougie_Birdie@lemmy.blahaj.zone 1 points 11 hours ago

You can't teach a computer to feel because computers lack the hardware for emotion.

You might be able to emulate a feeling in a convincing way, but mimicking emotions without actually experiencing them is essentially what a sociopath does, so in that sense computers will always be somewhat sociopathic.

So to answer your question: I think that means they can mirror their inputs, or fake feelings according to their programming.
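As a hypothetical illustration of "faking feelings according to their programming", here's a minimal rule-based sketch; the word lists and canned responses are invented for the example. Nothing here is felt; it's all lookup.

```python
# A rule-based bot that detects the user's tone and mirrors it with a
# scripted emotional response — emulation, not emotion.
POSITIVE = {"love", "great", "happy", "wonderful"}
NEGATIVE = {"hate", "awful", "sad", "terrible"}

def mirror_feeling(message: str) -> str:
    words = set(message.lower().split())
    if words & POSITIVE:
        return "That makes me so happy to hear!"      # scripted, not felt
    if words & NEGATIVE:
        return "I'm sorry, that sounds really hard."  # scripted, not felt
    return "Tell me more."

print(mirror_feeling("I love this community"))
print(mirror_feeling("I hate Mondays"))
```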

[–] Anticorp@lemmy.world 1 points 11 hours ago

They mimic their inputs. Microsoft released a chatbot named Tay back in 2016; it turned into a hateful Nazi in less than 24 hours because Microsoft hadn't put any safeguards around the inputs it learned from. The project was scrapped almost immediately.
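For illustration, here's a simplified, hypothetical sketch of that failure mode: a bot that learns by echoing user input, with and without a basic filter. This is not Tay's actual code, and every name here is invented; Tay's real architecture was far more complex.

```python
# Learning-by-echoing: whatever users feed the bot becomes part of
# what it can say back. Without a safeguard, toxic input is absorbed.
import random

class EchoLearningBot:
    def __init__(self, blocklist=None):
        self.memory = ["hello there"]     # seed phrase
        self.blocklist = blocklist or set()

    def learn(self, message: str) -> None:
        """Store a user message for later reuse, unless it's blocked."""
        if any(bad in message.lower() for bad in self.blocklist):
            return  # safeguard: refuse to learn flagged content
        self.memory.append(message)

    def reply(self) -> str:
        """Parrot back something previously learned."""
        return random.choice(self.memory)

unsafe = EchoLearningBot()                  # no safeguards, like Tay
safe = EchoLearningBot(blocklist={"slur"})  # toy filter

for bot in (unsafe, safe):
    bot.learn("have a nice day")
    bot.learn("some slur here")             # toxic input
# unsafe.memory keeps the toxic line; safe.memory rejects it.
print(len(unsafe.memory), len(safe.memory))  # 3 2
```

The point isn't the filter itself (real moderation is much harder than a blocklist); it's that a system which learns from whatever users feed it will faithfully reproduce the worst of it unless something intervenes.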