this post was submitted on 15 Feb 2024
29 points (87.2% liked)
Autism
you are viewing a single comment's thread
I would love to be corrected, but when I looked into it, it sounded like you'd want 32 GB of VRAM or more for real chat ability. You need enough memory to load the whole model, and any layers your GPU can't hold get offloaded to system RAM, which takes a major performance hit. You'd then probably want to aim for a model with around 72 billion parameters. That's a decently conversational level and maybe close to the one you're using (though it's possible they're higher; I'm just guessing). Smaller models in the ~34B range seem comparatively more prone to hallucination and inaccuracy. 32 GB of VRAM sounded like roughly the entry point for the 72B models, so I stopped looking, because I can't afford that.
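As a rough sanity check (my own back-of-the-envelope math, not anything authoritative), you can estimate memory needs from parameter count times bytes per parameter, plus some overhead for the KV cache and activations:

```python
def vram_estimate_gb(params_billion: float, bytes_per_param: float,
                     overhead: float = 1.2) -> float:
    """Rough VRAM estimate: model weights plus ~20% overhead for the
    KV cache and activations (the 20% figure is an assumption)."""
    return params_billion * bytes_per_param * overhead

# A 72B model at common quantization levels:
for label, bits in [("fp16", 16), ("int8", 8), ("int4", 4)]:
    print(f"{label}: ~{vram_estimate_gb(72, bits / 8):.0f} GB")
# → fp16: ~173 GB, int8: ~86 GB, int4: ~43 GB
```

By this rough math, even an aggressively quantized 4-bit 72B model overflows a 32 GB card somewhat, which would mean offloading part of it to CPU RAM and eating the performance hit described above.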
So someone with more experience or knowledge can hopefully correct me or give a better explanation, but just in case, maybe this is a helpful starting point for someone.
You can download models from huggingface.co and interact with them through a web UI like this one.