this post was submitted on 14 Jul 2023
32 points (97.1% liked)

Selfhosted


So, this is a rather odd request of a backup solution, but it's kinda what I want right now.

I'm still relatively new to Linux and self-hosting in general.

A few years ago, my cousin and I were hosting our own Minecraft server. It had a mod that would create backups of the world folder. It zipped it up, named it "yyyy-mm-dd.zip" and placed it in a backups folder somewhere on the server.

The most important feature that I want is actually the next part. It would allow us to specify how many backups we wanted to keep, and also how frequently we wanted the backup to run.

We set it to back up daily, and keep 14 days of backups. After that, it would delete the oldest one and make a new backup.

I would like to replicate that functionality! Specify the frequency, but ALSO how many backups to keep.

Idk if it's asking too much. I've tried doing some research, but I'm not sure where to start.

Ideally I'd like something I can host on docker. Maybe connect to a Google account or something so it can be off-site.

I only want to use it for docker config files, compose files, container folders, etc.

I've looked into restic, but it seems it encrypts the backups, and you NEED a working copy of restic to restore? I'd like something simple like a .zip file instead or something, to be able to just download, unzip, and spin up the compose file and stuff.

Sorry for the wall of text, thanks in advance if you have any suggestions!

P.S. I'm pretty sure the upload to Google or some other service would have to be a separate program, so I'm looking into that as well.

Update: I want to thank everyone for your wonderful suggestions. As of right now, I have settled on a Docker container of Duplicati, backed up to my Mega.nz account. Last I checked they lowered the storage limit, but I was lucky to snag an account when they were offering 50GB free when you joined, so it's working out well so far. I did have to abandon my original idea, and decided to look for something with deduplication (now that I know what it is!) and encryption.

top 30 comments
[–] [email protected] 6 points 1 year ago (2 children)

I do something similar with rclone. Most server software has some way of creating backups. Have that software create a backup and use rclone to move the file over to some cloud storage. Rclone also has the option to delete older stuff (rclone delete --min-age 7d). Do all that with a shell script and add it to the crontab.
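
A minimal sketch of that approach, as a script you could call from cron. The source path and the "gdrive:" remote name are placeholders — rclone needs the remote configured via `rclone config` first:

```shell
#!/bin/sh
# Sketch of the rclone approach above. The source path and the
# "gdrive:" remote name are placeholders for your own setup.

backup_name() {
  # dated archive name, like the Minecraft mod's yyyy-mm-dd.zip
  printf '%s.zip' "$(date +%Y-%m-%d)"
}

run_backup() {
  src=$1              # e.g. /opt/server/world
  remote=$2           # e.g. gdrive:backups
  name=$(backup_name)
  zip -qr "/tmp/$name" "$src"            # 1. zip it up
  rclone copy "/tmp/$name" "$remote"     # 2. ship it off-site
  rclone delete "$remote" --min-age 14d  # 3. prune old remote copies
  rm -f "/tmp/$name"
}

# crontab entry to run it nightly at 03:00:
# 0 3 * * * /usr/local/bin/run_backup.sh
```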

[–] [email protected] 4 points 1 year ago* (last edited 1 year ago)

I think this is the best solution. rclone also has built in crypt too.

Edit: built in crypt if you configure it for use

[–] [email protected] 2 points 1 year ago (1 children)

That sounds like the 2nd part of what I want! The uploading to off-site part! Awesome, I'll def look into it, thank you!

[–] [email protected] 3 points 1 year ago (1 children)

If you look at my recent post history I gave out my script using rclone to backup my server. It's in NixOS but you can ignore it as it is bash scripting at its core. It has everything you need like using rclone to delete older backups.

[–] [email protected] 2 points 1 year ago

Just saved your comment for reference later, thank you so much!

[–] [email protected] 6 points 1 year ago (1 children)

You should be able to achieve this with Kopia

[–] [email protected] 4 points 1 year ago (1 children)

I'm trying this out right now with Kopia for Docker, and I'm not the biggest fan of (seemingly) not being able to turn off the obfuscation, or to make it output just a single .zip or .tar file. Also, I'm having a hard time setting up drive integration with the GUI, but that's just my fault. I'm not familiar with rclone or Kopia at all.

[–] [email protected] 2 points 1 year ago

You can mount the complete backup as a local file system, which I think would suit your needs. I’m not familiar with their various integrations either, I just back up over SFTP.

But to reassure you, I also needed a bit of trial and error with Kopia, as it’s not the easiest GUI ever to get used to. But I’ve got it running now, and I’m very happy with it. I’ve also used it to successfully restore multiple backups (to test if it worked) and they all worked.

[–] [email protected] 5 points 1 year ago* (last edited 1 year ago) (1 children)

Borg Backup works great for that, exactly its use case. It's a command-line thing, but you can use Vorta as a UI if you want that. If you have a NAS, it can back up directly to that.

I have a second cronjob in my setup that syncs the encrypted archive to B2 nightly. Works great

[–] [email protected] 3 points 1 year ago

But... The Vorta were the spokespeople for the Dominion. Why would they be working for the Borg?? Why would you do this to me???

[–] [email protected] 5 points 1 year ago (1 children)

What you want is a bash script and a cron job that calls it. Most of what you need is likely already installed for you.

"crontab -e" will pull up your crontab editor. Check out "man crontab" first to get an idea of the syntax...it looks complicated at first but it's actually really easy once you get the hang of it.
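
For example, a crontab line like this (the script path is a placeholder) runs a backup script every night at 3:00 — the five fields are minute, hour, day-of-month, month, day-of-week, then the command:

```
0 3 * * * /usr/local/bin/backup.sh
```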

Your script will call tar to create your backup archive. You'll need the path to the folder where your files to backup are and then something like: tar -C PATH_TO_FILES -czf PATH_AND_NAME_OF_BACKUP.tgz .

That last dot tells it to tar up everything in the current folder. You can also use backticks to call programs inline... like date (man date). So if your server software lives in /opt/server and your config files you want to backup are in /opt/server/conf and you want to store the backups in /home/backups you could do something like:

tar -C /opt/server/conf -czf /home/backups/server_bkup.`date +%Y%m%d`.tgz .

Which would call tar, tell it to change directory (-C) to /opt/server/conf and then create (-c) +gzip (-z) into file (-f) /home/backups/blah.tgz everything in the new current directory (.)

I don't know if that's what you're looking for but that would be the easiest way to do it...sorry for potato formatting but I'm on mobile
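
Putting those pieces together with a pruning step might look like this — a sketch using the same example paths, with the 14-day retention the OP described:

```shell
#!/bin/sh
# tar-based backup with simple age-based pruning, building on the
# commands above. Paths are just the example's; adjust to taste.

make_backup() {
  src=$1    # e.g. /opt/server/conf
  dest=$2   # e.g. /home/backups
  tar -C "$src" -czf "$dest/server_bkup.$(date +%Y%m%d).tgz" .
  # keep ~14 days: delete archives whose mtime is older than 14 days
  find "$dest" -name 'server_bkup.*.tgz' -mtime +14 -delete
}

# make_backup /opt/server/conf /home/backups
```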

[–] [email protected] 2 points 1 year ago

No honestly, this was very helpful!

This, in combination with the solutions some others have suggested here already, would be pretty much what I want, just in multiple different parts, instead of 1 program/utility.

I'll def look into this, and honestly see if I can find a docker image for something like this as well!!

Thank you so much!!!

[–] [email protected] 5 points 1 year ago (1 children)

Duplicati does this and it's one of the best backup solutions imo

[–] [email protected] 3 points 1 year ago

Awesome, I'll add it to the list of software to look into! Actually, if it does everything, then it's gonna be the first one I try! Thank you!

[–] [email protected] 5 points 1 year ago* (last edited 1 year ago) (2 children)

I would say since you want simple .zip archives, this could be something to script yourself since it would be fairly easy.

Basically:

  • Zip the server into a dated zip file
  • Check for old zip files and delete
  • Upload zip files using rclone to remote storage (gdrive, etc)
  • Optionally send a notification to discord, telegram, healthchecks.io, or something like that

The downside of zipping backups like this is obviously storage space, every backup takes up the full amount of space of the server, since there's no deduplication or incremental versioning happening.
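
If you'd rather keep a fixed number of backups than an age window (the "keep 14" behaviour the mod had), a count-based prune is only a few lines. A sketch, with the directory and count as parameters; the upload and notification steps aren't shown:

```shell
#!/bin/sh
# Count-based pruning: keep only the newest $keep zip files in a folder.
# Complements the step list above; rclone upload / notify not shown.

prune_keep_newest() {
  dir=$1
  keep=$2
  # list zips newest-first, skip the first $keep lines, delete the rest
  ls -1t "$dir"/*.zip 2>/dev/null | tail -n +"$((keep + 1))" | while IFS= read -r f; do
    rm -f "$f"
  done
}
```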

[–] [email protected] 4 points 1 year ago

@MangoPenguin
If you're scripting it yourself, https://www.complete.org/dar/ gives a few extra niceties over just zip files or tarballs.
Thank @jgoerzen for the nice summary.
@koinu

[–] [email protected] 2 points 1 year ago

I think this might be the way I have to go!

I'm really liking Kopia. Nice GUI and some pretty nice settings. But I don't like the obfuscation. Like you said, I just want the zip files. I think I'll try Borgbackup, then rclone to drive, but I'll also look into just scripting it myself!

Thank you so much :)

[–] [email protected] 4 points 1 year ago (1 children)

Sounds like a job for logrotate. It does more than just log files, kinda average name I guess. Checkout this server fault q&a for more details. https://serverfault.com/questions/196843/logrotate-rotating-non-log-files

[–] [email protected] 3 points 1 year ago (1 children)

I'll have to look more into this, because I think I misunderstood, but it seems that it is half of the backup solution, right? It won't actually MAKE the backups, but it'll allow me to "rotate" and only keep the last "x" files?

[–] [email protected] 2 points 1 year ago (1 children)

Yep that's the one. If you can make a cron job to make the zip file, logrotate could handle keeping the last x files.

It might sound complicated, but the cool thing about *nix environments is that everything is made up of a combo of little tools. You can learn one at a time and slowly build something super complicated over time. First thing would be figuring out the right set of commands to make a zip file from the directory I reckon. Then add that to cron so it happens every day. Then add logrotate into the mix and have that do its thing every day after the backup runs.
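
As a sketch of that last step, a logrotate stanza for a backup zip might look like this (the path is hypothetical; drop it in /etc/logrotate.d/ or point logrotate at the file directly):

```
/home/backups/world.zip {
    daily
    rotate 14
    nocompress
    missingok
}
```

Each rotation renames world.zip to world.zip.1 and so on, keeping the last 14.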

[–] [email protected] 2 points 1 year ago

So I think I'll try Duplicati for docker next, and if that fails, then I'll try scripting and cronjobs.

I'm so happy with all the support, thank you! :)

[–] [email protected] 2 points 1 year ago (1 children)

You can look at backuppc, it has served us well for years now. Offsite, manages incremental and full back ups, file deduplication, etc.
So on your Minecraft server, do a daily backup and add the day of the week to it (whatever.7.gz); this way you always have 7 backups on the server and it auto-rotates. Add that folder to backuppc and the backup server will automatically thin out the backups as they get older.

[–] [email protected] 1 points 1 year ago

Thank you so much for your suggestion, imma add it to my list :)

[–] [email protected] 1 points 1 year ago (1 children)

@[email protected] I suspect you can force restic not to encrypt. The other additional advantage of restic and similar tools is that you can specify s3, sftp and other targets.

[–] [email protected] 1 points 1 year ago (1 children)

I'll keep digging into it, and probably spin up a container to fully test it out myself.

Thank you!

[–] [email protected] 1 points 1 year ago (1 children)

Dang. I'm still gonna look into it though. The hardest part was getting the names of different software. I kept finding different ways to do it in CLI, but no docker software or anything.

[–] [email protected] 2 points 1 year ago (1 children)

As someone already said, Duplicati. You can install it using docker and it has a super easy web GUI

[–] [email protected] 1 points 1 year ago

This is what I'm going to try next, as I'm not completely happy with Kopia
