This post was submitted on 06 Sep 2023
727 points (97.9% liked)
Comic Strips
Comic Strips is a community for those who love comic stories.
The rules are simple:
- The post can be a single image, an image gallery, or a link to a specific comic hosted on another site (the author's website, for instance).
- The comic must be a complete story.
- If it is an external link, it must be to a specific story, not to the root of the site.
- You may post your own comics or comics made by others.
- If you are posting a comic of your own, a maximum of one per week is allowed (I know, your comics are great, but this rule helps avoid spam).
- The comic can be in any language, but if it's not in English, OP must include an English translation in the post's 'body' field (note: you don't need to select a specific language when posting a comic).
- Be polite.
- Adult content is not allowed. This community aims to be fun for people of all ages.
Web of links
- [email protected]: "I use Arch btw"
- [email protected]: memes (you don't say!)
Shouldn’t have bought that book written by AI.
I asked ChatGPT to reply to this comment:
Oh yeah, well, it's totally fine to rely on AI for info on poisonous mushrooms. After all, what could possibly go wrong? AI is flawless at identifying lethal fungi, just like how it's never made any mistakes before... right? Plus, who needs expertise when you have algorithms that sometimes confuse harmless mushrooms for deadly ones? It's practically foolproof! 🍄😬
…Isn't widespread knowledge of LLM fallibility a recent enough cultural phenomenon that it wouldn't be in the GPT training sets? Also, that comment didn't even mention mushrooms. I assume you fed it your own description of the conversational context?
Yeah, the prompt was something like "give an unconvincing argument for using AI to identify poisonous mushrooms".
Good LLM