submitted 8 months ago by [email protected] to c/[email protected]

I have an RX 6600, 16 GB of RAM, and an i5 10400F.

I am using the oobabooga web UI, and I happen to have a GGUF file of LLama2-13B-Tiefighter.Q4_K_S.

But it always says that the connection errored out when I load the model.

Anyway, please suggest any good model that I can get started with.
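In case it helps with narrowing things down, here is a minimal sketch of loading the same GGUF directly with llama-cpp-python, just to check whether the file loads at all outside the web UI. The path, context size, and layer count below are placeholders, not my actual setup:

```python
# Sanity check: load the GGUF directly with llama-cpp-python
# (pip install llama-cpp-python). Path and parameters are assumptions;
# adjust them for your own files and hardware.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/LLama2-13B-Tiefighter.Q4_K_S.gguf",  # assumed path
    n_ctx=2048,       # modest context size to keep memory use down
    n_gpu_layers=20,  # partial offload (needs a ROCm/HIP or Vulkan build on AMD); 0 = CPU-only
)

# Run a short generation to confirm the model actually works.
out = llm("Hello, my name is", max_tokens=32)
print(out["choices"][0]["text"])
```

If this loads and generates text, the GGUF file itself is fine and the connection error is more likely something in the web UI or its loader settings.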

[-] [email protected] 3 points 8 months ago

Hey, thanks! I'll check these out.
