this post was submitted on 13 Jun 2023
11 points (100.0% liked)

Free and Open Source Software

17957 readers
165 users here now

If it's free and open source and it's also software, it can be discussed here. Subcommunity of Technology.


This community's icon was made by Aaron Schneider, under the CC-BY-NC-SA 4.0 license.

founded 2 years ago

I am using duplicati and thinking of switching to Borg. What do you use and why?

[–] [email protected] 7 points 1 year ago (1 children)

I use restic. For local backups, Timeshift.

[–] [email protected] 1 points 1 year ago

Seconded. I use restic with remote blob storage and it works nicely.

[–] [email protected] 6 points 1 year ago (1 children)

Using Borg Backup, just because there are some nice frontends for the GNOME ecosystem (when I am using GNOME, I love to use GNOME apps), and it has a nice CLI for scripting when I'm using something else (I use it on servers).

[–] [email protected] 3 points 1 year ago (1 children)

And there is a nice graphical frontend for it too: Vorta

[–] [email protected] 1 points 1 year ago

Personally, I'm more of a Pika Backup user ;)

[–] [email protected] 4 points 1 year ago* (last edited 1 year ago) (2 children)

I don't have backups. :/

And I will regret it some day.

I use github for code so that's backed up though.

[–] [email protected] 6 points 1 year ago

There are two kinds of people.
Those who make backups and those who will.

[–] [email protected] 1 points 1 year ago

You very much will. It's easier than you'd think.

[–] [email protected] 4 points 1 year ago

There is no such thing as the objectively best solution. Each tool has advantages and disadvantages. And every user has different preferences and requirements.

Personally, I have been using Borg for years. And I have had to restore data several times, which has worked every time.

In addition to Borg, you can also look at Borgmatic. This wrapper extends the functionality and makes some things easier.

And if you want to use a graphical user interface, you can have a look at Vorta or Pika.
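For anyone new to it, a minimal Borg session looks roughly like this. The repo path and passphrase are placeholders, and the sketch skips itself on machines without borg installed:

```shell
# Sketch of a typical Borg workflow; repo path and passphrase are placeholders.
if command -v borg >/dev/null 2>&1; then
    export BORG_PASSPHRASE='example-passphrase'
    repo="$(mktemp -d)/repo"

    borg init --encryption=repokey "$repo"                 # one-time repository setup
    borg create --stats "$repo::docs-{now}" /etc/hostname  # deduplicated, encrypted archive
    borg prune --keep-daily=7 --keep-weekly=4 "$repo"      # thin out old archives
    borg_ok=yes
else
    borg_ok=skipped
fi
echo "borg sketch: $borg_ok"
```

Borgmatic basically wraps the create/prune/check cycle above in a single YAML-configured command.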

[–] [email protected] 4 points 1 year ago

I just use rsync to back up my home folder to my NAS.

[–] [email protected] 3 points 1 year ago* (last edited 1 year ago)

What problem are you trying to solve? Please think about that, and about your backup strategy, before you decide on any specific tools.

For example, here are several scenarios that I guard against in my backup strategy:

  • Accidentally delete a file, I want to recover it quickly (snapshots);
  • Entire drive goes kablooie, I want my system to continue running without downtime (RAID)
  • User data drive goes kablooie, I want to recover (many many options)
  • Root drive goes kablooie, I want to recover (baremetal recovery tools)
  • House burns down or computer is damaged/stolen (offsite backups)
[–] [email protected] 3 points 1 year ago

I've been using restic. It has built-in dedup & encryption and supports both local and remote storage. I'm using it to back up to a local restic-server (pointing to a USB drive) and Backblaze B2.

Restoring single files or small sets of files is easy: restic -r $REPO mount /mnt. Then browse through the filesystem view of your snapshots and copy files just like on any other filesystem.
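The backup side is just as short. A sketch with a placeholder repo and password, skipped when restic isn't installed:

```shell
# Sketch of creating a restic repo and taking a first snapshot; paths are placeholders.
if command -v restic >/dev/null 2>&1; then
    export RESTIC_PASSWORD='example-password'
    repo="$(mktemp -d)/repo"

    restic -r "$repo" init                 # create the encrypted repository
    restic -r "$repo" backup /etc/hostname # first snapshot
    restic -r "$repo" snapshots            # list what's in the repo
    restic_ok=yes
else
    restic_ok=skipped
fi
echo "restic sketch: $restic_ok"
```

Pointing -r at a rest: or b2: URL instead of a local path is all it takes for the remote targets mentioned above.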

[–] [email protected] 2 points 1 year ago (2 children)

Kopia has served me well. I back up to my local Ceph S3 storage and then keep a second clone of that on a RAID array.

Kopia has good performance, and multiple hosts can back up to it concurrently while preserving deduplication -- unlike Borg.

[–] [email protected] 1 points 1 year ago (1 children)

Kopia has been working great for me as well. It's simple, versatile and reliable. I previously used Duplicati but kept running into jobs failing for no reason, backup configurations missing randomly and simple restores taking hours. It was a hot mess and I'm happy I switched.

[–] [email protected] 3 points 1 year ago (1 children)

I want to love kopia but the command line syntax feels unnatural to me. I don't know why either. For the whole month I test drove it, I had to look up every single time how to do something. Contrast this with restic which is less featureful in some ways but a few days in it felt like I was just using git.

[–] [email protected] 1 points 1 year ago (1 children)

I never used the command line with Kopia beyond starting it in server mode; I used the web-based GUI to configure it, and it was pretty simple to get everything set up that way. You may want to give Kopia another try in that mode.

[–] [email protected] 1 points 1 year ago (1 children)

My use case is headless machines, which unfortunately makes it a no-go in that regard.

[–] [email protected] 3 points 1 year ago (1 children)

You can use the web ui remotely.

Personally I use it from the command line, though, and my only complaint is that it's too easy to start a backup you didn't intend to. But if you're careful about using the kopia snapshot command then it's fine.

[–] [email protected] 1 points 1 year ago

Oh I thought the webui was only for server mode.

I just quickly glanced through the manuals of both restic and kopia. I think my trouble with kopia is that its style feels kind of weird. I'm just not able to wrap my head around it well.

kopia snapshot create /dir is shorter but more confusing than restic -r repo backup /dir

[–] [email protected] 2 points 1 year ago

Just a reminder: consider and test your restore process as well. Backups without restore testing are questionable. Also think about how the restore will go: do you want a bare-metal restore, or will you just reinstall and then restore certain things? Many of these backup methods will not produce a true bare-metal restore set, nor can file-system backups be "perfect" if they are taken on a running system. Databases and things like ecryptfs mounts, for example, can be problematic. Nor do all tools necessarily back up the full structure of the file system.

Not saying these are always issues, just be aware of them.

[–] [email protected] 2 points 1 year ago* (last edited 1 year ago)

I just use a script on a systemd timer. Well, two scripts on two timers really: one running daily, one weekly, for different data. It's just a bunch of rsync commands copying folders to an HDD in my system, and I redirect the output into a simple log file, mainly to verify that it ran at all (I am a bit paranoid about that). I can also run it manually whenever I want. Some of the data I also rsync again to an SMB cloud drive from Hetzner. I don't keep multiple versions, and I delete remote files that have been deleted locally. It's just a 1:1 copy.
Oh, and I use openSUSE Tumbleweed, so I have automatically configured btrfs snapshots. Though I haven't needed them yet and couldn't even say how to use them. I'll figure that out once I need them.
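For reference, the timer side of a setup like this is just two small user units (names and paths here are made up):

```ini
; ~/.config/systemd/user/backup-daily.service  (illustrative names/paths)
[Unit]
Description=Daily rsync backup

[Service]
Type=oneshot
ExecStart=%h/bin/backup-daily.sh

; ~/.config/systemd/user/backup-daily.timer
[Unit]
Description=Run the daily backup script

[Timer]
OnCalendar=daily
Persistent=true

[Install]
WantedBy=timers.target
```

Enable with systemctl --user enable --now backup-daily.timer; Persistent=true catches up on runs missed while the machine was off.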

[–] [email protected] 2 points 1 year ago

For my Ubuntu desktop, I use the builtin backup tool to take backups on my NAS. For my homelab, I have everything running on Proxmox and my Proxmox backup server takes care of the homelab backups.

[–] [email protected] 2 points 1 year ago* (last edited 1 year ago) (1 children)

I use my own scripts with rsync etc. I don't back up the OS itself, since I have its installation automated with scripts as well; I just back up the specific things I need.

[–] [email protected] 1 points 1 year ago (1 children)

automated with scripts

Would you like to share those, or do you have references for creating such scripts? This has been on my to-do list for years, but I always struggle with where to begin.

[–] [email protected] 2 points 1 year ago* (last edited 1 year ago)

They're very personalized to my setup, so they're not particularly useful in a general sense - I'd recommend something more like using this guide which seems to be pretty good: https://jumpcloud.com/blog/how-to-use-rsync-remote-backup-linux-system

Learning Bash has been great for me; it's helped a ton being able to automate so many different things, even just installing and configuring specific applications to work the way I want.

I think a script to manually run for manual backups plus a different script to run for automatic backups scheduled via cronjob is a great way to go.

[–] [email protected] 2 points 1 year ago

Rsync is great, but if you want snapshots and file history, rsnapshot works pretty well. It's based on rsync, but on every sync it creates hard links for unchanged files and only copies changed and new files. It saves space and remains transparent to the user. FreeFileSync is also amazing.

[–] [email protected] 2 points 1 year ago (1 children)

I use NixOS so all my system configuration is already saved in my NixOS configs, which I save on GitHub. For dotfiles that aren't managed by NixOS I use syncthing to sync them between my devices, but no real backup cause I can just remake them if I need to, and things like my Neovim and VSCode configs are managed by my NixOS configs so they're backed up as well.

[–] [email protected] 2 points 1 year ago (1 children)

You can take this to the extreme too by erasing your root partition each boot: https://grahamc.com/blog/erase-your-darlings/

Using that method you isolate all important state on the system for backup with zfs send.
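The send/receive side looks roughly like this; the pool and dataset names (tank/persist, backup/persist, the nas host) are made up, and the sketch only runs where that dataset actually exists:

```shell
# Sketch: replicate a dataset snapshot to a backup pool. All names are placeholders.
if command -v zfs >/dev/null 2>&1 && zfs list tank/persist >/dev/null 2>&1; then
    snap="tank/persist@$(date +%F)"
    zfs snapshot "$snap"                                      # point-in-time snapshot
    zfs send "$snap" | ssh nas zfs receive -F backup/persist  # replicate offsite
    zfs_ok=yes
else
    zfs_ok=skipped
fi
echo "zfs sketch: $zfs_ok"
```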

[–] [email protected] 3 points 1 year ago* (last edited 1 year ago)

Yeah I have a full impermanence setup using tmpfs, which is really nice. I did it like on the NixOS wiki and it's been helpful for organize my dotfiles and keeping track of all the random stuff that programs put everywhere.

I actually have all my stuff in a separate /stuff folder kinda by accident so my /home only has dotfiles and things like that.

[–] [email protected] 1 points 1 year ago* (last edited 1 year ago) (3 children)

I use btrfs snapshots and btrbk

btrfs is a great filesystem, and btrbk complements it nicely. Switching between snapshots is also really easy if something goes wrong and you need to restore.

Archwiki docs for btrfs: https://wiki.archlinux.org/title/Btrfs#Incremental_backup_to_external_drive

Of course you'd still want a remote location to backup to. You can use an encrypted volume with cloud storage. So google drive, etc all work.
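Done by hand, the snapshot-and-send cycle that btrbk automates looks roughly like this (/data and /mnt/backup are placeholders; the sketch only runs on an actual btrfs subvolume):

```shell
# Sketch of a manual btrfs incremental backup; paths are placeholders.
if command -v btrfs >/dev/null 2>&1 \
        && btrfs subvolume show /data >/dev/null 2>&1 \
        && [ -d /mnt/backup ]; then
    btrfs subvolume snapshot -r /data /data/.snap-new   # read-only snapshot
    btrfs send -p /data/.snap-old /data/.snap-new \
        | btrfs receive /mnt/backup                     # incremental transfer vs. parent
    btrfs_ok=yes
else
    btrfs_ok=skipped
fi
echo "btrfs sketch: $btrfs_ok"
```

btrbk's main job is running this cycle on a schedule and pruning old snapshots by a retention policy.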

[–] [email protected] 1 points 1 year ago

This is what I do. Btrfs snapshots and use send/receive with my NAS.

[–] [email protected] 1 points 1 year ago

This is the way!

[–] [email protected] 1 points 1 year ago (1 children)

I'm currently working on a disaster recovery plan using fsarchiver. I have very limited experience with it so far, but it had the features and social proof I was looking for.

I have so far used it to create offline filesystem backups of two volumes; one was LUKS-encrypted (it has to be manually "opened" with cryptsetup).

It can backup live filesystems which was important to me.

It's early days for my experience with this, but I'm sure others have used it and might chime in.
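For reference, the basic save command looks like this; the device and archive path are placeholders, and the sketch skips itself unless they actually exist:

```shell
# Sketch: save a filesystem to an archive. Device and paths are placeholders.
if command -v fsarchiver >/dev/null 2>&1 && [ -b /dev/sdb1 ] && [ -d /backup ]; then
    fsarchiver savefs -A /backup/data.fsa /dev/sdb1   # -A: allow saving a mounted (live) fs
    # restore later with: fsarchiver restfs /backup/data.fsa id=0,dest=/dev/sdb1
    fsa_ok=yes
else
    fsa_ok=skipped
fi
echo "fsarchiver sketch: $fsa_ok"
```

The -A flag is the live-backup part; the caveats about in-flight state in the reply below still apply.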

[–] [email protected] 2 points 1 year ago

Just one warning: if doing live backups, think about state and test your restores. I mention it because things like databases and ecryptfs will not archive properly while live. There are various ways around this, but consider it if you care about getting really complete backups, taken at one point in time, on live systems.

[–] [email protected] 1 points 1 year ago

I use FreeFileSync. It's the only GUI tool I found that lets me sync folders while omitting file deletions. It lets you create batch files from the GUI, which I execute with crontab multiple times per day.

[–] [email protected] 1 points 1 year ago

I'm currently using TimeShift to back up my desktop onto an external hard drive (the why is how simple it is to use), and I'll be making a copy of anything I upload to my Jellyfin server onto the external hard drive as well. I hope to eventually have a dedicated backup server, and a duplicate of it at a friend's house for offsite backup too.

[–] [email protected] 1 points 1 year ago

I work with VMs mostly, so I go for Veeam B&R. The free tier lets you back up 10 VMs or machines.
