this post was submitted on 13 Dec 2023

LocalLLaMA


Community for discussing LLaMA, the large language model created by Meta AI.

This is intended to be a replacement for r/LocalLLaMA on Reddit.


I have a laptop with a Ryzen 7 5700U and 16 GB of RAM, running Fedora 38 Linux.
I'm looking to run a local uncensored LLM, and I'd like to know the best model and software to run it with.
I'm currently running KoboldAI with Erebus 2.7b. The speed is okay, but I'm wondering if there's anything better out there. If possible, I'd prefer something that isn't web-UI based, to reduce overhead.
I'm not very well versed in the lingo yet, so please keep it simple.
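(As a rough rule of thumb when picking a model for 16 GB of RAM — this is a general estimate, not something from this thread: a quantized model needs roughly parameters × bits-per-weight ÷ 8 bytes, plus some overhead for context/KV cache. The function name and the 20% overhead factor below are assumptions for illustration.)

```python
def approx_model_ram_gb(n_params_billion, bits_per_weight=4, overhead=1.2):
    """Rough RAM estimate for a quantized model:
    parameters * bits/8 bytes, plus ~20% overhead (assumed)
    for context/KV cache and runtime buffers."""
    bytes_total = n_params_billion * 1e9 * bits_per_weight / 8 * overhead
    return bytes_total / 1e9

# A 7B model quantized to 4 bits: roughly 4.2 GB,
# which fits comfortably in 16 GB of system RAM.
print(round(approx_model_ram_gb(7), 1))

# The same model at 8 bits roughly doubles that footprint.
print(round(approx_model_ram_gb(7, bits_per_weight=8), 1))
```

By this estimate, a 4-bit 7B model should be well within reach on this machine, while a 13B model at 4 bits (~7.8 GB) would still leave headroom for the OS.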
Thanks!

[–] [email protected] 3 points 11 months ago

Hope you had some success. Don't hesitate to ask if you have any further questions.