
datahoarder


Who are we?

We are digital librarians. Among us are represented the various reasons to keep data -- legal requirements, competitive requirements, uncertainty of permanence of cloud services, distaste for transmitting your data externally (e.g. government or corporate espionage), cultural and familial archivists, internet collapse preppers, and people who do it themselves so they're sure it's done right. Everyone has their reasons for curating the data they have decided to keep (either forever or For A Damn Long Time). Along the way we have sought out like-minded individuals to exchange strategies, war stories, and cautionary tales of failures.

We are one. We are legion. And we're trying really hard not to forget.

-- 5-4-3-2-1-bang from this thread

3-2-1 Backup Rule (www.starwindsoftware.com)
submitted 1 year ago* (last edited 1 year ago) by [email protected] to c/[email protected]
 

Something I haven't seen posted here yet, but worth saying over and over again.

Murphy's law says that anything that can go wrong will go wrong… but with the 3-2-1 strategy in place (three copies of your data, on two different types of media, with one copy offsite), your data has a very good chance of surviving.

[email protected] 2 points 1 year ago* (last edited 1 year ago)

What’s the best way to make an offsite backup for 42 TB at this point with 20 Mbps of bandwidth? It would take over six months to upload while maxing out my connection.

Maybe I could sneakernet an initial backup then incrementally replicate?
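(The napkin math checks out. A quick sketch, assuming decimal terabytes and a fully saturated 20 Mbps uplink:)

```python
# How long does 42 TB take to upload at 20 Mbit/s?
data_bytes = 42e12          # 42 TB, decimal terabytes
uplink_bps = 20e6           # 20 Mbit/s, fully saturated

seconds = data_bytes * 8 / uplink_bps
print(f"{seconds / 86400:.0f} days (~{seconds / (86400 * 30):.1f} months)")
# -> 194 days (~6.5 months)
```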

[email protected] 1 point 1 year ago

Outside my depth, but I'll give it a stab. Identify what data is important (is the full 42 TB needed?). Can the data be split into easier-to-handle chunks?

If it is, then I'd personally do an initial sneakernet to get the first set of data over, then mirror the differences on a regular basis. A minimal sketch of that workflow below.
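Something like this, assuming rsync over SSH is available on both ends (the paths and hostname are placeholders): after the one-time sneakernet seed, a recurring job only ships the deltas, which a 20 Mbps link can usually keep up with.

```python
#!/usr/bin/env python3
"""Incremental mirror after an initial sneakernet seed (sketch).

Hypothetical layout: local data in /tank/data, the seeded copy
restored to /backup/data on backup.example.com.
"""
import subprocess

SRC = "/tank/data/"                             # trailing slash: sync the contents
DEST = "user@backup.example.com:/backup/data/"

subprocess.run(
    [
        "rsync",
        "-a",           # archive mode: recurse, preserve perms/times/links
        "--delete",     # remove remote files deleted locally (true mirror)
        "--partial",    # keep partially transferred files if the link drops
        "--progress",   # per-file progress, useful on a slow uplink
        SRC,
        DEST,
    ],
    check=True,         # raise if rsync reports an error
)
```

Run it from cron or a systemd timer; rsync's delta transfer means only changed files (and, for large files, only the changed blocks) cross the wire.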