
TechTakes


Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community

[–] [email protected] 24 points 4 months ago* (last edited 4 months ago) (18 children)

Yann and co. just dropped Llama 3.1. Now there's an open-source model on par with OAI and Anthropic, so who the hell is going to pay these nutjobs for access to their APIs when people can get roughly the same quality for free, without the risk of having to hand your data to a third party?

These chuckle fucks are cooked.

[–] [email protected] 22 points 4 months ago (17 children)

For "free" except you need thousands of dollars upfront for hardware and a full hardware/software stack you need to maintain.

This is like saying Azure is cooked because you can rack-mount your own PC.

[–] [email protected] 10 points 4 months ago (9 children)

That's mostly true. But if you have a gaming GPU in a PC running Linux, you can easily use Ollama to run the 8-billion-parameter Llama 3 locally without much real overhead. A rough sketch of what that looks like is below.
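
(A minimal sketch, assuming Ollama is installed, the llama3:8b model has been pulled, and the server is listening on its default port, 11434; the prompt is just a placeholder:)

```python
# Minimal local-inference sketch against Ollama's default HTTP API.
# Assumes `ollama pull llama3:8b` has already been run and the server
# is listening on localhost:11434 (Ollama's default).
import requests

def ask_local_llama(prompt: str) -> str:
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": "llama3:8b", "prompt": prompt, "stream": False},
        timeout=120,
    )
    resp.raise_for_status()
    # Non-streaming responses return the full completion in "response".
    return resp.json()["response"]

if __name__ == "__main__":
    print(ask_local_llama("Why does local inference keep data on-device?"))
```

Nothing in that request ever leaves your machine, which is the whole point.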

[–] [email protected] 6 points 4 months ago (2 children)

Azure/AWS/other cloud computing services that host these models are absolutely going to continue to make money hand over fist. But if the bottleneck is the infrastructure, then what's the point of paying an entire team of engineers $650K a year each to recreate a model that's qualitatively equivalent to an open-source one?

[–] [email protected] 4 points 4 months ago

For me, the bottleneck is my data. I want to keep my data. And honestly I don't know why any entity is OK with sharing their data for some small productivity improvements. But I don't understand a lot.

[–] [email protected] 3 points 4 months ago* (last edited 4 months ago) (1 children)

The engineers can generally also do other things, the security will likely be better, and it's entirely possible the API costs would exceed that sum anyway if your usage is heavy enough to need that much in-house expertise.

[–] [email protected] 6 points 4 months ago

The engineers can generally also do other things

What's the job posting for that going to look like: "LLM stack maintainer wanted, must also be an accomplished front-end developer in case things get slow"?
