this post was submitted on 08 May 2024

Permacomputing


Computing to support life on Earth

Computing in the age of climate crisis is often wasteful and adds nothing useful to our real life communities. Here we try to find out how to change that.

Definition and purpose of permacomputing: http://viznut.fi/files/texts-en/permacomputing.html

XMPP chat: https://movim.slrpnk.net/chat/lowtech%40chat.disroot.org/room

Sister community over at lemmy.sdf.org: [email protected]

There's also a wiki: https://permacomputing.net/

Website: http://wiki.xxiivv.com/site/permacomputing.html


Hi all, I'm really looking for some help. I need to set up a reliable system for backups and data storage. I'm not tech-savvy (I'll work on that when it's a priority in my life, which it definitely can't be right now) and I'm asking this community because it's forward-thinking and aligns with my values. There are things I have right now, on paper and digitally, that I want to be able to retrieve at least a decade from now (and then we'll check in on how the situation has changed and what's worth keeping or printing out). Most of the stuff bouncing around in my brain is the conventional advice:

  1. The age-old "at least three places"
  2. Don't store what I don't strictly need
  3. Accessible & simple: the less I have to fiddle, the more sustainable it is (kind of seems to conflict with 1)
  4. Privacy-first, don't trust clouds, etc (kind of seems to conflict with 1, too!)

I'm not sure (a) if there are any other principles to keep in mind while designing a system that works for me or (b) how this might translate into practical advice about hardware or software solutions. If anything has or hasn't worked for you personally, please share. My daily driver is a LineageOS tablet and it's not clear to me how to best keep its data safe.

top 3 comments
[email protected] 4 points 6 months ago

I'm not a fan of backups. They are a special path, orthogonal to how you actually use your computers, meaning extra time and energy spent just on finding suitable hosts, doing the copies regularly and, most of all, *actually verifying that the copy went well* (i.e. testing the backup), which gets more and more important the longer your system is in place.

I opted for a different strategy: I have a folder for my photos and another folder for my "Documents" (at large). Both exist on my computer and on my phone and are synchronised with Syncthing. I also have extra copies on other servers, one of which keeps old versions, but I have never actually needed them, which is just as well, because I have never checked that it works correctly.

Compared to a backup, I have something that works seamlessly in the background (I don't fiddle with shell scripts that fail because I put single quotes instead of double quotes), I actually test that the copy works because I use the files on two different devices, and the fact that everything is bluntly copied means I am forced to think "is this worth keeping?". I aim to keep my folders under 50 GB combined, which is a lot for a phone but nothing in the grand scheme of things. Most of that is actually videos I pre-download to watch offline while on the move, but that's another matter.

Syncthing also means I can trivially add new devices as life goes on and old ones die.


[email protected] 3 points 6 months ago

have 2 flash/hard/whatever drives: A and B

once a month (or at whatever frequency you can sustain)

backup your data to A and next cycle backup your data to B

nothing fancy or technical, just some basic consistent backups.

if you can do that, you'll likely be fine. There are endless enhancements you can make if you are more technical or can follow technical instructions.
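A minimal sketch of the A/B rotation above, assuming the drives mount at /media/driveA and /media/driveB and the data lives in ~/Documents (all of these paths are placeholders, substitute your own):

```shell
#!/bin/sh
# Alternate backup target by month parity: even months go to drive A,
# odd months to drive B. Paths below are placeholders.
DATA_DIR="$HOME/Documents"

month=$(date +%m)
month=${month#0}    # strip a leading zero so "08" isn't read as octal

if [ $((month % 2)) -eq 0 ]; then
    TARGET="/media/driveA"
else
    TARGET="/media/driveB"
fi

# Only copy if the drive is actually mounted.
if mountpoint -q "$TARGET"; then
    rsync -a "$DATA_DIR/" "$TARGET/backup/"
    echo "backed up to $TARGET"
else
    echo "$TARGET is not mounted; skipping this cycle"
fi
```

Because the two drives alternate, even if one backup run goes wrong you still have last month's copy on the other drive.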

[email protected] 2 points 6 months ago* (last edited 6 months ago)

My typical backup system:

  • a computer that needs backing up is on a network
  • the computer hosting backups is on that network
  • [optional] the backup host is either powered down, or has brought its network connection down (so it's not visible and not hackable)
  • at a predetermined time, the backup host wakes (brings its network connection up)
  • it checks if the backup source is present, aborting if not
  • it logs in via SFTP (an FTP-like protocol built on top of SSH) with public key authentication and pulls the backups down from the source according to a script (SFTP can execute a list of tasks based on a command script)
  • after a successful download (but not after a failed one) it searches for backups that are too old and erases them ¹
  • finally the backup host powers down or leaves the network
  • optional final step: occasionally, a disk image of the backup host is taken, the memory card is put in a bottle, the cap is screwed on tight and the bottle is hidden under a stone :)
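The pull step in the middle of that list could look roughly like this. A sketch only, assuming the source answers as source.lan, the key lives at ~/.ssh/backup_key and archives land in /srv/backups (all hostnames and paths here are made up, adjust to your setup):

```shell
#!/bin/sh
# Build today's SFTP batch file, then fetch the backup archive
# if the source machine is reachable. Host, key and paths are placeholders.
SOURCE="backup@source.lan"
KEY="$HOME/.ssh/backup_key"
DEST="/srv/backups"
STAMP=$(date +%Y%m%d)
BATCH="/tmp/pull-$STAMP.sftp"

# sftp runs a list of commands from a batch file via -b.
cat > "$BATCH" <<EOF
lcd $DEST
get /home/user/backups/backup-$STAMP.tgz
EOF

# Abort quietly if the source isn't on the network right now.
if ping -c 1 -W 2 source.lan >/dev/null 2>&1; then
    sftp -i "$KEY" -b "$BATCH" "$SOURCE"
else
    echo "source not reachable; skipping this run"
fi
```

Pulling from the backup host (rather than pushing to it) is what lets the host stay offline and invisible between runs, as described above.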

Regarding data protection: ideally, both computers use disk encryption. Especially the backup host, since it's unattended and could be taken by a burglar (or a cop), and holds the private key that can access the backup source.

¹ erasing old stuff is easy enough in Linux/Bash:

for i in $(seq 5 10); do
    DATE=$(date --date "$i days ago" +%Y%m%d)
    echo "Deleting backups from $i days ago, that is [$DATE]."
    # do something
done

...generates a sequence of past dates ranging from 5 to 10 days in the past and attempts to delete something for each. Or alternatively, for those who like fancier, shorter and a bit riskier commands...

find "${BACKUP_DIR}" -name 'backup*.tgz' -mtime +10 -exec rm {} \;

...finds files under $BACKUP_DIR named "backup*.tgz" and, if their modification time is more than 10 days old, passes them as arguments to "rm". (Quoting the pattern and letting find do the matching avoids shell glob expansion, which would make the command fail when no file matches.)