Hello internet users. I have tried GPT4All and like it, but it is very slow on my laptop. I was wondering if anyone here knows of any solutions I could run on my server (Debian 12, AMD CPU, Intel Arc A380 GPU) through a web interface. Has anyone found a good way to do this?

[email protected] 6 points 8 months ago

Thanks to this post and the other comments in here, I've discovered that the ultimate UI for AI models may well be

https://github.com/ParisNeo/lollms-webui

and on HuggingFace (that name is awful: to me it's the creepy, horrible facehugger from the movie Alien, which I saw so many decades ago) TheBloke has some smaller models

https://huggingface.co/TheBloke/

so you can choose a model that will actually work on your hardware.
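If it helps, here's roughly what grabbing one of those GGUF files looks like with the huggingface_hub library instead of clicking through the site. Just a minimal sketch: the repo id and filename are examples I'm assuming exist on TheBloke's page, so check the model's "Files" tab and swap in the quant that fits your RAM.

```python
# Minimal sketch: download a quantized GGUF file from HuggingFace.
# The repo_id and filename are assumptions -- check the model's files
# page and pick the quantization that actually fits your hardware.
from huggingface_hub import hf_hub_download

model_path = hf_hub_download(
    repo_id="TheBloke/Llama-2-7B-Chat-GGUF",   # example repo, swap for your pick
    filename="llama-2-7b-chat.Q4_K_M.gguf",    # ~4-bit quant, much smaller than fp16
)
print(model_path)  # local cache path you can point a runner at
```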

I think Llama-2 for brainstorming and CodeLlama-Instruct for learning programming examples seem to be the cleanest pair, from what I've read, and he's got GGUF versions at different quantizations, so you can choose what will actually fit on your hardware.
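And here's a rough sketch of what loading one of those quantized files looks like with llama-cpp-python (just one runner among many, not necessarily what lollms-webui uses under the hood; n_ctx and n_gpu_layers are guesses you'd tune for your own machine):

```python
# Rough sketch: run a downloaded GGUF model with llama-cpp-python.
# model_path is the .gguf file from the previous snippet; n_ctx and
# n_gpu_layers are guesses to tune for your own RAM / GPU backend.
from llama_cpp import Llama

llm = Llama(
    model_path=model_path,
    n_ctx=2048,        # context window; larger needs more memory
    n_gpu_layers=0,    # 0 = pure CPU; raise it only if your GPU backend works
)

out = llm(
    "Q: Write a Python function that reverses a string.\nA:",
    max_tokens=128,
)
print(out["choices"][0]["text"])
```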

There are other models on HuggingFace that seem very useful, like

  • whisper-large-v3 for speech-to-text (there's a small sketch after this list),
  • whisperspeech for text-to-speech,
  • sdxl-turbo for image generation (for some copyright-free subjects to practice drawing with), and so on.
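For the speech-to-text one, the transformers pipeline is about as short as it gets. A sketch, assuming you have transformers and ffmpeg installed, with "audio.wav" standing in for your own recording:

```python
# Sketch: transcribe an audio file with whisper-large-v3 via transformers.
# "audio.wav" is a placeholder; device=-1 keeps everything on the CPU.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="openai/whisper-large-v3",
    device=-1,   # -1 = CPU; set 0 for the first CUDA GPU if you have one
)

result = asr("audio.wav")
print(result["text"])
```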

Some models require a GPU, but not all of them do.
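A quick way to see which camp you're in is to ask PyTorch; note this only checks for CUDA, so an Intel Arc card like the A380 will report CPU here unless you set up Intel's own PyTorch backend (which I haven't tried, so I won't pretend to know the incantation):

```python
# Generic check: can PyTorch see a CUDA GPU? Falls back to CPU otherwise.
import torch

device = "cuda" if torch.cuda.is_available() else "cpu"
print(f"models will run on: {device}")
```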

Damn, things have moved fast!