this post was submitted on 25 Dec 2023
1923 points (97.9% liked)
People Twitter
5390 readers
2057 users here now
People tweeting stuff. We allow tweets from anyone.
RULES:
- Mark NSFW content.
- No doxxing people.
- Must be a tweet or similar
- No bullying or international politics
- Be excellent to each other.
- Provide an archived link to the tweet (or similar) being shown if it's a major figure or a politician.
founded 2 years ago
MODERATORS
you are viewing a single comment's thread
view the rest of the comments
It's not like that, though. Newer phones are going to have dedicated hardware for running neural networks, LLMs, and other generative models. That dedicated hardware will let these processes barely sip battery.
wrong.
if that existed, all those AI server farms wouldn't be so necessary, would they?
dedicated hardware for that already exists, but it definitely isn't gonna fit a sizeable model on a phone any time soon. the models themselves take tens of gigabytes of storage, so you couldn't fit more than a handful even on 512 GB of internal storage. phones don't come close to the RAM these models need, and the dedicated hardware still draws far more power than a tiny phone battery can supply.
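The storage and RAM claim above checks out with simple arithmetic: LLM weight size is roughly parameters times bytes per parameter. A minimal sketch (the model sizes are illustrative, not tied to any specific model):

```python
# Back-of-envelope estimate of LLM weight footprint:
# parameters (in billions) * bytes per parameter.
def weight_size_gb(n_params_billion: float, bytes_per_param: float) -> float:
    return n_params_billion * 1e9 * bytes_per_param / 1e9

# A 70B-parameter model stored at 16-bit precision needs ~140 GB for
# weights alone; even a 7B model at 16-bit needs ~14 GB, which has to
# sit in RAM (or VRAM) for fast inference.
print(weight_size_gb(70, 2))  # → 140.0
print(weight_size_gb(7, 2))   # → 14.0
```

This ignores activation memory and KV cache, which add further overhead on top of the weights.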
Those server farms are because the needs of corporations might just be different from the needs of regular users.
I'm running an 8 GB LLM locally on my PC that performs better than 16 GB models from just a few months ago.
It's almost as if technology can get better and more efficient over time.