this post was submitted on 23 Jul 2024
41 points (90.2% liked)
LocalLLaMA
2250 readers
Community to discuss LLaMA, the large language model created by Meta AI.
This is intended to be a replacement for r/LocalLLaMA on Reddit.
founded 1 year ago
The code is FOSS, but the weights aren't. This is pretty common with e.g. FOSS games; the only difference here is that weights are much costlier to remake from scratch than game assets.
The license has limitations and isn't something standard like Apache 2.0.
True, but it hardly matters for the source since the architecture is pulled into open source projects like transformers (Apache) and llama.cpp (MIT). The weights remain under the dubious Llama Community License, so I would only call the data “available” instead of “open”.
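To make that split concrete, here's a minimal sketch using the Apache-licensed transformers library. The model ID is just an illustrative example, and fetching Meta's weights still requires accepting their license on Hugging Face and authenticating with a token:

```python
# Sketch: the modeling code ships under Apache 2.0 with transformers,
# but the weights behind this (illustrative) model ID are gated by the
# Llama Community License and need an accepted license + HF token.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Meta-Llama-3-8B-Instruct"  # assumed example, gated repo

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "Why are open weights not the same as open source?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```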
I'll just stick to Mistral
Are you using Mistral 7B?
I also really like that model and its fine-tunes. If licensing is a concern, it's definitely a great choice.
Mistral also has a new model, Mistral Nemo. I haven't tried it myself, but I heard it's quite good. It's also licensed under Apache 2.0 as far as I know.
Is it part of Ollama?
Edit: https://ollama.com/library/mistral-nemo
Yes, you can find it here.
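If you'd rather call it from code than the CLI, here's a minimal sketch against Ollama's local REST API, assuming the server is running on the default port 11434 and you've already pulled the model (e.g. `ollama pull mistral-nemo`):

```python
# Sketch: one-shot chat with mistral-nemo via a locally running Ollama
# server (default port 11434); the model must be pulled beforehand.
import requests

resp = requests.post(
    "http://localhost:11434/api/chat",
    json={
        "model": "mistral-nemo",
        "messages": [
            {"role": "user", "content": "Summarize the Apache 2.0 license in one sentence."}
        ],
        "stream": False,  # return a single JSON object instead of a stream
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["message"]["content"])
```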