this post was submitted on 18 Aug 2023
101 points (94.7% liked)
Maybe historically. But the more I used or was exposed to ChatGPT, the more I realised how bad it still is. It certainly won't be coming for my job, even in the next 20 years, I'm sure of that. It's a bit like nuclear fusion: it's been ready for use in the next 25 years for like 75 years.
If you're using ChatGPT, you're using an outdated model. GPT-4 is better and has been out for some time.
ChatGPT is weaker and not something I'd use frequently. GPT-4 is much stronger and much more useful. And the next generation is coming soon, which will be better than GPT-4.
It's not like fusion in its current state, since LLMs are already usable. The remaining task is making them more effective. To continue the fusion analogy, it would be like if we'd finally developed a reactor that generates more energy than it consumes, and now only sought to make it produce more power.