this post was submitted on 13 Nov 2023
192 points (93.2% liked)
Comic Strips
Comic Strips is a community for those who love comic stories.
The rules are simple:
- The post can be a single image, an image gallery, or a link to a specific comic hosted on another site (the author's website, for instance).
- The comic must be a complete story.
- If it is an external link, it must be to a specific story, not to the root of the site.
- You may post comics from others or your own.
- If you are posting a comic of your own, a maximum of one per week is allowed (I know, your comics are great, but this rule helps avoid spam).
- The comic can be in any language, but if it's not in English, OP must include an English translation in the post's 'body' field (note: you don't need to select a specific language when posting a comic).
- Politeness.
- Adult content is not allowed. This community aims to be fun for people of all ages.
you are viewing a single comment's thread
Having personally tried to use ChatGPT 4 for a job task (programming), I would disagree strongly with this sentiment. I have yet to find a task where it doesn't partially fail because it has no notion of the concepts underlying the topic.
As an example, I asked it to write a class that reads from a well-known file type. It had many correct ideas for certain operations (compiled from other sources, of course), but it failed at the basic concept of class instantiation: it was calling class methods in the constructor, which is simply not allowed in the language being used. I went through several iterations with it, to no avail, before giving up on it.
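The language in question isn't named, but the failure mode carries over to most object-oriented languages. Here's a minimal Python sketch of the kind of instantiation mistake described; the `FileReader` class and its method names are invented for illustration:

```python
class FileReader:
    """Hypothetical reader illustrating a constructor-ordering mistake."""

    def __init__(self, data: bytes):
        # The kind of generated code described above calls a method
        # before the state it depends on exists, e.g.:
        #
        #     self._header = self._parse_header()  # AttributeError:
        #     self._data = data                    # _data not set yet
        #
        # The fix is to initialize state before calling methods on self:
        self._data = data
        self._header = self._parse_header()

    def _parse_header(self) -> bytes:
        # Pretend the first 4 bytes are a magic number identifying the format.
        return self._data[:4]


reader = FileReader(b"RIFFxxxx")
print(reader._header)  # b'RIFF'
```

A human engineer catches this immediately because they understand that a constructor runs top to bottom on a not-yet-initialized object; a model pattern-matching from other sources often doesn't.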
In "normal" language tasks, it seems quirky but passable. But give it a highly technical task where nuance and conceptual knowledge are needed, and I have yet to see it work with any reliability.
I use it for programming a lot too. You have to explain everything to it like you would to a brand-new engineer, and even then it's often wrong about certain parts, like you said. But if you know enough about coding to spot where it's wrong and just write those parts yourself, it can still be a huge time saver.
Yeah, I'd agree that with sufficient iterations and clarifying remarks, ChatGPT can produce something close to functional. I was mostly disagreeing with the original comment's sentiment that it could be treated like the computer on the Enterprise. While those had several plot-specific flaws, the duotronic computers were generally competent and didn't need everything spelled out for them.