I use immutable NixOS installs. Everything needed to redeploy my OS is tracked in git, including most app configurations. The one exception is some GUI apps I'd have to set up manually on reinstall.
I have a persistence volume for things like:
- Rollbacks
- Personal files
- Git repos
- Logs
- Caches / Games
I have 30 days (or at minimum the last 5) of system rollbacks using BTRFS volumes.
The personal files are backed up hourly to a local server, which then backs up nightly to Backblaze B2 using rclone, into an encrypted volume keyed with my private keys. The local server has a mishmash of drives in a mirrored LVM setup. While it works well for mixed drives, I'll warn that I haven't had a drive failure yet, so I'm not sure how difficult replacing a drive will be.
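For anyone curious, the rclone side of that could look roughly like this (remote and bucket names are placeholders, and the crypt password would come from `rclone obscure`):

```
# ~/.config/rclone/rclone.conf (illustrative)
[b2]
type = b2
account = <key_id>
key = <application_key>

[b2-crypt]
type = crypt
remote = b2:my-backup-bucket/backups
password = <output of rclone obscure>
```

```
# nightly job on the local server (paths are placeholders)
rclone sync /srv/backups b2-crypt: --transfers 8 --fast-list
```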
My phone uses the same flow with RoundSync (rclone + GUI).
Git repos are backed up in git.
Logs aren't backed up. I just persist them for debugging and don't want them lost after every reboot.
Caches/Games are persisted but not backed up. NixOS uses symlinks and BTRFS to be immutable, and that paradigm doesn't work well for this case. The one exception is that a couple of game folders are part of my personal files: my WoW addon folder, EVE Online layouts, etc.
I used to use Dropbox (with rclone to encrypt). It was $20/mo for 2 TB, which is cheaper on paper, but I don't back up nearly that much. Backblaze started at $1/mo for what I use; I'm now up to $2/mo. It will be a few years before I need to clean up my backups for cost reasons.
The local server is a PC in a case with 8 drive bays, plus some NVMe drives for fast storage. It has a couple of older drives, and for the last couple of years I've typically bought a pair of drives on sale (Black Friday, Prime Day, etc.). I have a little over 30TB mirrored, so slightly over 60TB in total. NVMe is not counted in that: one NVMe is for the system, the others are a caching layer (Monero node) or temp storage (transcoding, as it's also my media server).
I like the case, but if I were to do it again, I'd probably get a rack mountable case.
You seem pretty organized in your strategy. I would suggest you just pull a drive from your LVM setup to check how that goes for you; I've had issues with drive swaps in JBOD-style LVM volumes, but YMMV.
Frankly, I use ZFS now for anything I would have used LVM for before. The feature set is way more robust. Also, an offsite ZFS replication to zfs.rent is a good backup of a backup. But Backblaze is pretty solid too.
Good call on a simulated failure. When I first set it up, it was LVM/BTRFS or ZFS as my top choices. It was a coin toss at the time because I hadn't built this sort of setup before.
Yeah, an untested RAID is like an untested backup: suspect.
Here's one that probably nobody else here is doing. The backup goes on my mobile device. Yes, the thing in my pocket.
- Mount it over SSHFS on the local network
- Unlock a LUKS container in the form of a 30GB sparse file on the device
- rsync the files across
- Lock, unmount
The backup is incremental but the container file never changes size, no matter what's in it. Your data is in two places and always under your physical control. But the key is never stored on the remote device, so you could also do this with a VPS.
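Roughly, the flow could look like this (hostnames, mount points, and the container/mapper names are placeholders):

```
# 1. mount the phone over SSHFS
sshfs phone.lan:/storage/backup /mnt/phone

# 2. unlock the 30GB sparse-file LUKS container (cryptsetup sets up the loop device)
sudo cryptsetup open /mnt/phone/vault.img pocketvault
sudo mount /dev/mapper/pocketvault /mnt/vault

# 3. incremental copy
rsync -a --delete ~/important/ /mnt/vault/

# 4. lock and unmount
sudo umount /mnt/vault
sudo cryptsetup close pocketvault
fusermount -u /mnt/phone
```

Creating the sparse container in the first place is essentially just `truncate -s 30G vault.img` followed by `cryptsetup luksFormat` and `mkfs` on the opened mapping.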
Highly recommended.
Where is the key stored?
Locally.
If your local machine dies, and you have a backup on your phone which you cannot unlock... aren't you screwed?
Good question. No, but at a small cost in security. I generated the key using sha512sum on a very solid memorized passphrase. This means I can regenerate the key in the scenario you describe.
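In sketch form, reusing the placeholder names from above (purely illustrative; a dedicated KDF would be stronger, but this is the idea):

```
# regenerate the keyfile from the memorized passphrase and unlock with it
read -rs -p "Passphrase: " pass; echo
cryptsetup open vault.img pocketvault \
    --key-file <(printf '%s' "$pass" | sha512sum | awk '{print $1}')
```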
etckeeper, and borg/vorta for /home
I try to be good about everything being installed in packages, even if I'm the one that made the package. That means I only have to worry about backing up my local package archive. But I've never actually recreated a personal system from a backup, and usually end up starting from a fresh install, slowly adding things back from the backup if I missed them. This tends to cut down on cruft and no-longer-needed hacks and fixes. It also makes for a good way to be exposed to new paradigms (desktop environments, shells, etc.).
Something that helps is daily notes: one file for any day I'm working on my system and want to remember what a custom file, config edit, or downloaded/created package does and why. These get saved separately, and I try to remember to grep them before asking the internet.
I see the benefit of snapshots, but disk space is expensive, and I'm (usually) careful (enough) not to lock myself out or prevent boots. Anything catastrophic I have to fix is usually seen as a fun, stressful learning experience! That rarely happens anymore, for better or for worse.
I use Bluebuild to create a reproducible system, plus a post-install script to handle remaining tasks such as setting up initial preferences.
Also Vorta to back up files and settings to an external HD, plus the OneDrive Linux client to sync files and settings to the cloud.
I use Duplicity to back up my home directory, excluding the Steam and Downloads folders. It is set up to back up weekly to my NAS, mounted over NFS. The NAS has a weekly cron task to upload the backups to pCloud using rclone. I back up several computers this way (2 desktops, 2 laptops, and the NAS as well). The files covered by this strategy are essentially my photos, documents, and configs; my software installations, games, and media library are not backed up.
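A minimal sketch of that kind of weekly Duplicity job (paths are placeholders and the Steam location is a guess; PASSPHRASE feeds Duplicity's default GPG encryption):

```
export PASSPHRASE=<long passphrase>
duplicity --exclude "$HOME/Downloads" --exclude "$HOME/.local/share/Steam" \
    "$HOME" file:///mnt/nas/backups/$(hostname)
```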
Nightly rsync to two NAS boxes in the house (TrueNAS Scale and a Synology). Docs go in Nextcloud, hosted on a VM in my basement, which is also backed up to the Synology by Proxmox. Also backing up my main machine (Pop!_OS) and my wife's laptop (ThinkPad E595, also Pop!_OS) using SpiderOak One.
Daily rsync to a local NAS, and weekly backups offsite with pika-backup.
Internal RAID1 as first line of defense. Rsync to external drives where at least one is always offsite as second. Rclone to cloud storage for my most important data as the third.
Backups 2 and 3 are manual but I have reminders set and do it about once a month. I don't accrue much new data that I can't easily replace so that's fine for me.
3-2-1.
Kopia backup to secondary HDD
- Pictures (phone photos backed up to my server via immich)
- workspace (git repos, ECAD, MCAD, firmware, etc...)
- qmk layout
- Documents
- vim folder with bundles
- ebooks
KDE Vaults are stored on the secondary HDD as well.
Soon I will set up Kopia to also back up everything via SSH to my server, and then the small essentials and important docs via Google Drive.
I need to set up cloud backups for the server too, but haven't had the time...
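A rough sketch of the Kopia side, assuming a local repository on the secondary HDD (paths and hostnames are placeholders; the SSH/SFTP part is the untested bit):

```
# local repository on the secondary HDD
kopia repository create filesystem --path /mnt/hdd2/kopia
kopia snapshot create ~/Pictures ~/workspace ~/Documents ~/ebooks

# later: a second repository on the server, reached over SFTP
kopia repository create sftp --host server.lan --username me \
    --keyfile ~/.ssh/id_ed25519 --path /srv/kopia
```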
I leverage btrfs or ZFS snapshots. I take rolling system-level snapshots on a schedule (daily, weekly, monthly, and separately before any package upgrades or installs) and user-data snapshots every couple of hours. Then I use btrbk to sync those snapshots to an external drive at least once a week. When I have all of my networking gear and home services set up, I also sync all of this to storage on my NAS. Any hosts on the network keep rolling snapshots stored on the NAS as well.
Important data also gets shoveled into a B2 bucket and/or Google drive if I need to be able to access it from a phone.
I keep snapshots small by splitting data up into well defined subvolumes, anything that can be reacquired from the cloud (downloads, package caches, steam libraries, movies, music, etc) isn't included in the backup strategy. If I download something and it's hard to find or important I move it out of downloads and into a location that is covered by my backups.
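The btrbk piece is just a config file plus a scheduled `btrbk run`. A trimmed-down sketch (pool path, subvolume names, and retention values are placeholders):

```
# /etc/btrbk/btrbk.conf
snapshot_preserve_min   2d
snapshot_preserve       14d 8w 6m
target_preserve_min     no
target_preserve         20d 10w 6m

volume /mnt/btr_pool
  snapshot_dir btrbk_snapshots
  target send-receive /mnt/external/btrbk
  subvolume @home
  subvolume @data
```

Then run `btrbk run` from cron or a timer whenever the external drive is attached.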
Currently I use Borg Backup with Vorta as a GUI. I don't really do anything automated/scheduled, I just back it up manually to an external SSD every few days or so. I pretty much do my whole /home folder, except for a couple of subfolders that aren't really necessary (and Videos, which I back up separately).
I do eventually want to upgrade to a NAS, but I'm waiting until we move to start setting that up. Also I don't really have an off-site plan yet which I know is bad, but I need to figure that out.
The important stuff is in cloud storage using Cryptomator (I'm hoping that rclone will make sync simple). I should probably set up Timeshift in case things do go wrong.
Not only because of third-world issues, but because I like adrenaline, I don't have any backup strategy beyond an old external HDD that I haven't copied stuff to since 2018.
When I could afford a new PC and tried to rsync my data from my old crappy laptop, much of it was lost.
That being said, I had a backup strategy back in the day, which was burning CDs. I used to have a second HDD (an IDE one), but they were so freaking bad that all of them failed after a year or so, so I have like 3 or 4 of them stored without any chance of recovering their data.
After having recently restored some stuff from an aging external HDD, I'm seriously considering getting a few DVD-R discs and burning the important things every now and then.
I know they don't last forever either, but - just as a random example that has definitely never happened to me hahaha - you can drop them from a height of 3 feet and still get files off them!
Keep everything on Nextcloud and back that up via Proxmox Backup Server.
Nuke and pave takes me less time to reconfigure Plasma and install NC client than bothering to back anything up directly.
When I researched this previously I concluded that there are two very good options for regular backups: Borg and Restic. These are especially efficient at backing up a diff of what has changed since the last backup. So you get snapshots of your filesystem state at each backup point without using a huge amount of space. You can mount any snapshot as a virtual directory. After the initial backup, incremental backups take a minute or two.
I use Borg, and I back up to cloud storage on Borgbase. I use Vorta as a GUI for Borg. I have Vorta start automatically when I start my window manager, and I have it set up for daily backups. I set up the same thing on my kid's computer.
I back up my home directory. I have some excluded directories like ~/.cache and Steam's data directory. I use Baobab to find large directories that I don't want backed up.
I use the "exclude caches" option in the Borg "create archive" settings. That automatically excludes Rust target/ directories because they follow the Cache Directory Tagging Specification. Not all programming languages' tooling follows that spec, so I also use directory-name pattern excludes. For example, I have an exclude pattern for .*/node_modules/.*
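On the command line, the same setup would look roughly like this (repo path and patterns are illustrative):

```
borg create --stats --exclude-caches \
    --exclude 'home/*/.cache' \
    --exclude 're:.*/node_modules/.*' \
    /path/to/repo::home-{now:%Y-%m-%d} ~/
```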
I use NixOS, and I keep my system config in a git repo so I don't need backups for anything outside my home directory.
For my home server, I use Restic and a cron job to take weekly snapshots of all my services. They then get synced to a Backblaze B2 bucket (at $6/TB/mo). It's pretty neat: it only saves the difference between the previous and current snapshot, removes older snapshots, and encrypts everything.
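A trimmed-down version of that kind of job (bucket name and paths are placeholders; RESTIC_PASSWORD or RESTIC_PASSWORD_FILE supplies the repo encryption key):

```
export B2_ACCOUNT_ID=<key_id> B2_ACCOUNT_KEY=<application_key>
restic -r b2:my-bucket:server backup /srv/services
restic -r b2:my-bucket:server forget --keep-weekly 8 --keep-monthly 6 --prune
```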
All of my servers make local dumps of their databases and config files to directories owned by unprivileged users. This includes file paths, permissions, and ownerships (so I know how to put them back).
My primary research server at home uses rsync to pull copies of those local backups from my servers.
My primary research server uses Restic to make a daily incremental backup to Backblaze's B2 service.
Dotfiles are handled by GNU Stow and git. I have this on all my devices.
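The Stow part is pleasantly boring; assuming one directory per tool in a ~/dotfiles git repo, it's just:

```
cd ~/dotfiles                       # git repo with vim/.vimrc, zsh/.zshrc, etc.
stow --target="$HOME" vim zsh git   # symlinks ~/.vimrc -> ~/dotfiles/vim/.vimrc, etc.
stow -n -v --target="$HOME" tmux    # -n = dry run, -v = show what would be linked
```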
Projects live in git.
Media is periodically rsynced from my server to an external drive.
Been meaning to put all my docker-composes into git as well...
I don't back up too much else.
.dotfiles on github
Big/critical files on an external HD
simple as
I use syncthing to sync almost everything across my computer, laptop (occasional usage), server (RAID1), old laptop (powered up once every month or so), and a few other devices (that only get a small subset of my data, though). On the computer, laptop, and server, I have btrfs snapshots (snapper). Overall, this works very well, I always have 4+ copies of my data in 2+ geographical locations.
Timeshift for the system works perfectly: if you screw up the system (a bad update, for instance), just start it and you'll be back up and running in less than ten minutes. Simple cron backups for data, documents, etc., just in case you delete a folder, document, image, and so on. Both of these go to a second internal HD.
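The cron part can be as simple as a couple of crontab entries (paths and times are only examples):

```
# nightly documents, weekly photos, both to the second internal HD
30 2 * * * rsync -a --delete /home/me/Documents/ /mnt/backup/Documents/
30 3 * * 0 rsync -a /home/me/Pictures/ /mnt/backup/Pictures/
```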
Software & Services:
- Restic client
- Restic REST server
- https://github.com/rbuchberger/res-man + systemd timers or cron to configure & run restic nightly
- healthchecks.io for monitoring
- ntfy.sh for notifications
Destinations:
- Local Raspberry Pi with an external HDD, running the restic REST server
- RAID 1 NAS at parents' house, connected via tailscale, also running restic REST
I've been meaning to set up a drive rotation for the local backup so I always have one offline in case of ransomware, but I haven't gotten to it.
Edit: For the backup set I back up pretty much everything. I'm not paying per gig, though.
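A stand-in sketch without the res-man wrapper: a oneshot service plus a timer, with a healthchecks.io ping at the end (repo URL and check UUID are placeholders):

```
# /etc/systemd/system/restic-backup.service
[Unit]
Description=Nightly restic backup

[Service]
Type=oneshot
Environment=RESTIC_REPOSITORY=rest:http://backup-pi.lan:8000/desktop
Environment=RESTIC_PASSWORD_FILE=/etc/restic/password
ExecStart=/usr/bin/restic backup /home /etc
ExecStartPost=/usr/bin/curl -fsS https://hc-ping.com/<check-uuid>

# /etc/systemd/system/restic-backup.timer
[Unit]
Description=Run restic backup nightly

[Timer]
OnCalendar=*-*-* 02:00:00
Persistent=true

[Install]
WantedBy=timers.target
```

Enable with `systemctl enable --now restic-backup.timer`.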
Pendrive for the important stuff, paper for the really important stuff and brain for everything else.
I have my important folders synced to my Nextcloud and create nightly snapshots of that to a different drive using borg.
One thing I still need to do is offsite encrypted backups using rsync.
Timeshift for configs to a locally attached drive. Home partition to cloud with rsync
Firstly, for my dotfiles, I use home-manager. I keep the config on my git server and in theory I can pull it down and set up a system the way I like it.
In terms of backups, I use Pika to backup my home directory to my hard disk every day, so I can, in theory, pull back files I delete.
I also push a core selection of my files to my server using Pika, just in case my house burns down. Likewise, I pull backups from my server to my desktop (again with Pika) in case Linode starts messing me about.
I also have a 2TiB ssd I keep in a strongbox and some cloud storage which I push bigger things to sporadically.
I also take occasional data exports from online services I use. Because hey, Google or Discord can ban you at any time for no reason. :P
For system files/configuration on my machines, timeshift set to run once a week.
For family photos and shared files, I built a pair of SFTP servers made from old HP thin-client PCs at two different geographic locations which automatically sync to each other once a day via cron job using vsftpd and lftp. Each one has both an NVMe and SATA SSD which run in a software RAID 1 configuration.
For any other files, a second local server also using vsftpd and two SSDs in USB enclosures. I manually back them up using rsync on an irregular basis.
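A sketch of what the daily mirror job between the two sites could look like (hosts, credentials, and paths are placeholders):

```
#!/bin/sh
# /usr/local/bin/sync-share.sh -- run from cron on site A, e.g. 0 4 * * *
lftp -u backupuser,"$(cat /root/.lftp-pass)" ftp://site-b.example \
     -e "mirror --reverse --only-newer --verbose /srv/family-share /family-share; quit"
```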
The glorious life of openSUSE: it defaults to btrfs on install, and everything is preconfigured with snapper out of the box. Easy life, nothing to worry about.
Yeah, that's snapshotting; what do you do about backups?