this post was submitted on 03 Jun 2024
1299 points (96.5% liked)
Technology
Mate, all it does is predict the next word or phrase. It doesn't know what you're trying to do, and it doesn't have any ethics. When it fucks up, it's going to be your fuckup, and since you relied on the bot rather than learning to do it yourself, you won't be able to fix it.
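For what it's worth, "predict the next word" can be sketched with a toy bigram model — count which word tends to follow which, then pick the most frequent continuation. Real LLMs use neural networks over subword tokens and far more context, but the core idea is the same (the corpus and function names here are made up for illustration):

```python
from collections import Counter, defaultdict

def train_bigram(corpus):
    """Count word-pair frequencies: freq[prev][next] -> count."""
    freq = defaultdict(Counter)
    words = corpus.split()
    for prev, nxt in zip(words, words[1:]):
        freq[prev][nxt] += 1
    return freq

def predict_next(freq, word):
    """Return the word most often seen after `word`, or None if unseen."""
    if word not in freq:
        return None
    return freq[word].most_common(1)[0][0]

corpus = "the cat sat on the mat the cat ran"
model = train_bigram(corpus)
print(predict_next(model, "the"))  # "cat" follows "the" twice, "mat" once
```

Note the model has no idea what a cat is; it only knows which strings tended to follow which in training data.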
I understand how it works, but that's irrelevant if it works as a tool in my toolkit. I'm also not relying on the LLM; I take its output with a massive grain of salt. It usually gets most of the way there, and I fix the remaining issues or have it revise the code. For simple stuff that would otherwise be busywork for me, it does pretty well.
It would be my fuckup if it fucks up and I don't catch it. I'm not putting code it writes directly into production; I'm not stupid.