[–] [email protected] 28 points 4 days ago (1 children)

Models are not improving, companies are still largely (massively) unprofitable, the tech has a very high environmental impact (and energy demand), and no solid business case has been found after two years, despite very large investments.

It's possible that AI isn't going anywhere, but LLM-based tools might also simply follow crypto, VR, the metaverse and the other tech "revolutions" that were just hyped and went nowhere. I can't say which way it will go, but I disagree with you about an "adjustment period". I think generative AI is cool and fun, but it's a toy. If companies don't make money with it, they will eventually stop investing in it.

Also

Today’s hype will have lasting effects that constrain tomorrow’s possibilities

is absolutely true. Wasting capital (human and economic) on something means it can't be used for something else instead. This is especially true now that it's so hard to get investment for any other business. If all the money right now goes into AI, and IF this turns out to be just hype, we will have collectively lost 2, 4, 10 years of research and investment in other areas (environmental protection, for example). I am really curious what makes you think that sentence is false and stupid.

[–] [email protected] 5 points 4 days ago (1 children)

Models are not improving? Since when? Last week? Newer models have been scoring higher and higher in both objective and subjective blind tests consistently. This sounds like the kind of delusional anti-AI shit that the OP was talking about. I mean, holy shit, to try to pass off "models aren't improving" with a straight face.

[–] [email protected] 9 points 3 days ago (2 children)

There is a bunch of research showing that model improvement is marginal compared to the growth in energy demand and/or the amount of training data. OpenAI itself mentioned about a month ago that they are seeing smaller improvements in Orion (I believe) over GPT-4 than there were between GPT-4 and GPT-3. We are also running out of quality data to use for training.

Essentially what I mean is that the big improvements we have seen in the past seem to be over; now improving a little costs a lot. Considering that the costs are exorbitant and the gains small, it's not impossible to imagine that companies will eventually give up if they can't monetize this stuff.

[–] [email protected] 3 points 2 days ago (1 children)

Compare Llama 1 to the current state-of-the-art local AIs. They're on a completely different level.

[–] [email protected] 2 points 2 days ago

Yes, because at the beginning there was tons of room for improvement.

I mean, take OpenAI's word for it: GPT-5 is not improving over 4 as much as 4 did over 3, and it's costing a fortune and taking forever. Logarithmic curve, it seems. Also, if we run out of data to train on, that's it.

[–] [email protected] 2 points 2 days ago (1 children)

Surely you can see there is a difference between marginal improvement with respect to energy and not improving.

[–] [email protected] 1 points 2 days ago

Yes, and I see that difference as hitting the logarithmic tail, which shows we are close to the limit. I also realize that exponential cost is a de facto limit on improvement. If improving again for GPT-7 will cost 10 trillion, I don't think it will ever happen, right?
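Here's a toy sketch of that diminishing-returns argument in numbers. The logarithmic score curve, the cost per unit of compute and all the constants are made up purely for illustration; they are not real scaling-law or pricing data from OpenAI or anyone else.

```python
# Toy illustration of the diminishing-returns argument above.
# The scaling curve and all constants are hypothetical, not real data.
import math

def score(compute):
    """Hypothetical benchmark score that grows logarithmically with compute."""
    return 20 * math.log10(compute)

cost_per_unit = 1.0  # hypothetical cost per unit of compute

for compute in (10**6, 10**7, 10**8, 10**9):
    gain = score(compute * 10) - score(compute)        # points gained by the next 10x
    extra_cost = (compute * 10 - compute) * cost_per_unit
    print(f"compute={compute:>13,}  next 10x buys {gain:.1f} points "
          f"for an extra {extra_cost:,.0f} units of cost")
```

On a log curve, every extra 10x of compute buys roughly the same fixed bump in score while the marginal cost grows tenfold, which is the "improving a little costs a lot" point in numbers.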