this post was submitted on 05 Sep 2023
35 points (92.7% liked)

Privacy

[–] [email protected] 8 points 1 year ago (1 children)

I’m a bit out of the loop with LLMs but it depends on what you’re doing.

Last I heard, you’re going to want a 65B or 70B model if you want something that runs as well as GPT-3.5, but good luck finding a GPU with enough VRAM to hold it without breaking the bank. You can offload layers to system RAM or even swap, but that comes with pretty steep performance penalties.
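To see why VRAM is the bottleneck, a rough back-of-the-envelope estimate is weights × bytes-per-weight, plus some overhead for the KV cache and activations. This is just a sketch; the 20% overhead factor is an assumption, not a measured figure:

```python
def vram_gb(params_billion: float, bits_per_weight: float, overhead: float = 1.2) -> float:
    """Rough VRAM estimate: parameter count * quantization width * overhead factor."""
    weight_bytes = params_billion * 1e9 * (bits_per_weight / 8)
    return weight_bytes * overhead / 1e9

# 70B model at fp16 vs 4-bit quantization, vs a small 7B model:
print(round(vram_gb(70, 16)))  # ~168 GB -- far beyond any single consumer GPU
print(round(vram_gb(70, 4)))   # ~42 GB  -- still more than a 24 GB card
print(round(vram_gb(7, 4)))    # ~4 GB   -- fits easily on consumer hardware
```

Even aggressively quantized, a 70B model overflows a single consumer card, which is why layer offloading comes up at all.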

I haven’t heard of a model that’s comparable to GPT-4, but like I said, I’m pretty out of the loop. You’d probably hit the same VRAM and performance issues, only worse, since bigger models are usually better.

All that being said, you might not need some huge model depending on what you’re doing. There are some smaller models that fit on consumer GPUs and perform surprisingly well in certain situations. There are also uncensored variants of models that won’t give you a moral lecture if you ask for something questionable. Then there’s the privacy aspect: I absolutely would not trust OpenAI with any personal information. I believe personal accounts can opt out of having their data used for training, but you’re still trusting OpenAI with whatever you send them.
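On the offloading point: runtimes like llama.cpp let you put some transformer layers in VRAM and run the rest from system RAM (e.g. an `n_gpu_layers` knob). The split itself is just simple arithmetic; the function name and the per-layer size below are hypothetical, made up for illustration:

```python
def gpu_layers(vram_budget_gb: float, total_layers: int, gb_per_layer: float) -> int:
    """How many layers fit in the VRAM budget; the remainder offloads to system RAM."""
    return min(total_layers, int(vram_budget_gb // gb_per_layer))

# Hypothetical 70B model at 4-bit: ~40 GB of weights across 80 layers -> 0.5 GB/layer.
print(gpu_layers(24, 80, 0.5))  # 48 -- a 24 GB card holds 48 of 80 layers
print(gpu_layers(48, 80, 0.5))  # 80 -- the whole model fits in VRAM
```

Every layer that spills to system RAM runs at memory-bus speed instead of VRAM speed, which is where the steep performance hit comes from.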

[–] [email protected] 1 points 1 year ago

I'm personally hoping the hardware situation will sort itself out in a few years, so I can hold off on upgrading my GPU until then.