this post was submitted on 05 Dec 2023
52 points (100.0% liked)
Chat
Relaxed section for discussion and debate that doesn't fit anywhere else. Whether it's advice, how your week is going, a link that's at the back of your mind, or something like that, it can likely go here.
you are viewing a single comment's thread
I have the same problem; my flat is only about 50 sqm. Judging by the way things are going, I think there's a chance Nvidia will release some consumer-grade hardware meant for LLMs in the near-ish future. Until they reveal their next lineup, I'm just sticking to the cloud for running LLMs, even though it may seem like a poor financial decision.
I'm also hoping to get my hands on some Raspberry Pis. I'd like to build a toy k3s cluster at some point and maybe run my own Mastodon instance. :)
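For anyone curious, bootstrapping a toy k3s cluster on a couple of Pis is roughly this sketch (the hostname `pi-server.local` and the token placeholder are made up; on Raspberry Pi OS you may also need to enable the memory cgroup in `/boot/cmdline.txt` first):

```shell
# On the first Pi (control plane): install the k3s server via the official script
curl -sfL https://get.k3s.io | sh -

# Grab the join token the server generated
sudo cat /var/lib/rancher/k3s/server/node-token

# On each additional Pi (agent): point at the server and pass the token.
# pi-server.local and <token-from-above> are placeholders for your setup.
curl -sfL https://get.k3s.io | K3S_URL=https://pi-server.local:6443 \
    K3S_TOKEN=<token-from-above> sh -

# Back on the server: check that the agents joined
sudo k3s kubectl get nodes
```

The nice part is that the same script installs either the server or the agent depending on whether `K3S_URL` is set, so the whole cluster is basically two curl commands.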
Well, at least I'm not the only one whose homelab ambitions are being crushed by their apartment layout. I think I'm going to end up with a 2U compute rack, which means I'll probably limp along on one or two consumer low-profile GPUs. Now if only I could work out the details of the actual rack server hardware...
A Raspberry Pi cluster is interesting! My only real exposure to using Pis in a homelab was an old Raspberry Pi 1B I was using for Pi-hole. It was great right up until it stopped working.