this post was submitted on 31 Mar 2024
0 points

Selfhosted


A place to share alternatives to popular online services that can be self-hosted without giving up privacy or locking you into a service you don't control.


I've just discovered OmniGPT, which seems to be a chat app where you can interact with different LLMs (Claude, GPT-4, Llama, Gemini, etc.) and costs $16/month (it was $7/month until a week ago 🤦‍♂️). I've read in a Reddit post that it simply uses each provider's API, which is something you can do for free with a personal account (since the API limits seem to be high). Do you know of something like OmniGPT that can be self-hosted and uses the user's own API keys?
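
(For reference, "using the provider APIs with your own key" amounts to something like the snippet below. This is only a minimal sketch using the official `openai` Python package; the model name and prompt are placeholders, and other providers have their own SDKs or OpenAI-compatible endpoints.)

```python
# Minimal sketch: call a provider's chat API directly with a personal key.
# Requires `pip install openai`; OPENAI_API_KEY is your own account's key.
import os
from openai import OpenAI

client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

response = client.chat.completions.create(
    model="gpt-4",  # placeholder model name
    messages=[{"role": "user", "content": "Summarize why self-hosting matters."}],
)
print(response.choices[0].message.content)
```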

top 8 comments
[–] [email protected] 0 points 7 months ago* (last edited 7 months ago) (1 children)
[–] [email protected] 0 points 7 months ago (1 children)

That seems to require the AI model to be local.

[–] [email protected] 0 points 7 months ago* (last edited 7 months ago) (1 children)

Oh yes, maybe I misunderstood what you were asking. This is the server that hosts the models and the API, and it also has a nice interface.

So by "local" I mean local to the server: you can run it somewhere else and not put the models on your own computer, but yes, the server will need them.

You can then use other apps to connect to it. That's what I consider self-hosting: hosting the whole thing, soup to nuts.
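
For example, if the server exposes an OpenAI-compatible endpoint (Ollama does, as one example, at /v1), pointing a client at it is mostly a base URL change. A rough sketch, with the server address and model name as assumptions:

```python
# Sketch: point an OpenAI-style client at a self-hosted server instead of a
# cloud provider. Assumes an OpenAI-compatible endpoint such as Ollama's
# (http://localhost:11434/v1); Ollama ignores the key, but it must be set.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

response = client.chat.completions.create(
    model="llama3",  # whatever model the server has pulled
    messages=[{"role": "user", "content": "Hello from a self-hosted client."}],
)
print(response.choices[0].message.content)
```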

[–] [email protected] 0 points 7 months ago (1 children)

What I'm looking for is a frontend that uses GPT-4, Gemini, and other AI engines with their respective API keys.

[–] [email protected] 0 points 7 months ago (1 children)

Yeah, using “self-hosted” in your title is misleading.

[–] [email protected] -1 points 7 months ago* (last edited 7 months ago) (1 children)

But I will... self-host this service! And besides the title, I've written a post with a description of what I'm looking for.

[–] [email protected] 0 points 7 months ago (1 children)

You want a frontend, not the "service" itself.
By "service" I usually mean the main logic part of something, in this case the LLM processing itself.
That's probably where the confusion is coming from.

[–] [email protected] -1 points 7 months ago

It's still self-hosted! 🤷🏻‍♂️