It's A Digital Disease!


This is a sub that aims at bringing data hoarders together to share their passion with like minded people.

126
 
 
The original post: /r/datahoarder by /u/dannydevitoloveme on 2024-12-31 20:24:39.

My dilemma: I have almost 2 TB of pictures and videos on my Google Drive and am running out of space. I have been intending to back them up in other places/externally for a while, both to free up space and to protect them in case the worst happens, but I'm honestly unsure of the best route. Probably 800 GB+ of that is videos and the rest is pictures.

I have done a deep dive on this subreddit, but I am not educated on this stuff at ALLLL, so I'm not understanding a lot of the terminology. I've seen the 3-2-1 rule, but it's not super clear to me. I've been considering an external hard drive (?) but I don't know if that will do what I want. Would it make sense to use something different like Dropbox in addition to Google? I'm also slightly broke, so more monthly subscriptions are a bit out of the question.

Sorry if these are dumb questions lol, I just don't wanna lose my data, and I have about 400 GB of space left.

127
 
 
The original post: /r/datahoarder by /u/rozza591 on 2024-12-31 15:55:48.

What setup do you use for your LTO?

Hey fellow hoarders I'm looking for some friendly advice.

For the last 3 years I'd been using the following setup, which worked great.

  • Quantum LTO-6 drive, model B
  • Areca ARC-1350 HBA
  • Windows 10
  • Quantum LTFS driver

About a year ago I had to wipe my PC, and I have not been able to get the LTO drive working since.

I'm wondering what y'all use. I know the drive, HBA, and cables are working, since I can load and unload tapes with tar, but I'd rather be using LTFS. I've tried Fedora, Ubuntu, and Unraid, but I can't seem to get it running on any of them.

Any advice or insight into your own working setup would be appreciated.
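For reference, with the open-source LTFS implementation on Linux, formatting and mounting a tape usually boils down to something like the following rough sketch; the /dev/sg4 node and the mount point are placeholders, and the exact package (Quantum/IBM/HPE LTFS) varies by distro:

# find the tape drive's generic SCSI node (assumed here to be /dev/sg4)
lsscsi -g
# format a fresh tape with LTFS (destroys any existing contents)
sudo mkltfs --device=/dev/sg4
# mount it via FUSE; unmount with 'fusermount -u /mnt/ltfs' when done
sudo mkdir -p /mnt/ltfs
sudo ltfs -o devname=/dev/sg4 /mnt/ltfs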

128
 
 
The original post: /r/datahoarder by /u/Exotic_Emergency642 on 2024-12-31 14:43:52.

I have had a 3-disk RAID array (onboard system-board controller) in use for about 4 years now. System software issues forced me to wipe the entire system recently, and in the process I added an extra drive to the RAID array.

All three disks have been in use for ~4 years (power-off / spin-down disabled).

SMART: ~40,000 hours of operation, ~100 power cycles

The fourth is a drive I had used with an Xbox and no longer needed.

SMART: ~15,000 hours of operation, ~100,000 power cycles

I should mention that these drives have all been shucked from portable housings as this is the easiest way to procure large ones around here.

This makes me curious: roughly how many power cycles indicate that a drive is on life support?

I've had wonderful results with low-cycle-count drives reaching much higher hour counts. In fact, all my internal drives have always had the idle spin-down/power-down feature disabled, since I fear it is a drive killer (just a gut instinct, I have no data to back this up). To this day I haven't seen an internal drive fail since I stopped buying Maxtor drives 20 or so years ago. Plenty of portables have gone bad, though (they do take more heat and vibration, and many times the WD controller board fails rather than the drive itself).

No, I promise I didn't play the Xbox for a year and a half straight lol; the darn machine must have been waking the drive multiple times a day while the system was sleeping!
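For reference, the counters in question are the SMART attributes Power_On_Hours (ID 9) and Power_Cycle_Count (ID 12); with smartmontools on Linux they can be read roughly like this (the device node is a placeholder):

# dump all SMART attributes, then pick out the two counters of interest
sudo smartctl -A /dev/sdb | grep -Ei 'Power_On_Hours|Power_Cycle_Count'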

129
 
 
The original post: /r/datahoarder by /u/ramakrishnasurathu on 2025-01-01 03:20:50.

Data storage and usage keep growing every day, but has the environmental impact been factored into this growth? Could we rethink how data centers are designed, powered, and managed to not just be efficient, but also aligned with sustainability goals? What would that kind of future look like?

130
 
 
The original post: /r/datahoarder by /u/randopop21 on 2025-01-01 01:27:15.

Getting old. Slowing down and/or getting hard of hearing. I need subtitles to fully understand dialogue.

How do I ensure that the movies I've searched for contain subtitles?

Sometimes they are in a separate .srt file. But sometimes they are inside the MKV file. And when it comes to MKV files, it's not clear if they have subs or not.

And, sadly, most of the ones I come across don't have any subtitles and I have to search for them separately.
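For reference, ffprobe (part of FFmpeg) can list whatever subtitle tracks an MKV actually carries, which makes it easy to tell the files with embedded subs from the ones without; the filename below is a placeholder:

# list subtitle streams (index, codec, language tag) inside the container
ffprobe -v error -select_streams s \
    -show_entries stream=index,codec_name:stream_tags=language \
    -of csv=p=0 movie.mkv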

131
 
 
The original post: /r/datahoarder by /u/Deadboy90 on 2025-01-01 00:04:16.
132
 
 
The original post: /r/datahoarder by /u/bob-bulldog-briscoe on 2024-12-31 23:16:26.

I'm looking for some high-quality, archival-grade BD-R discs. I'm all set on CD-R and DVD-R as I have a bunch of old-stock TY on hand from years ago, as well as some Verbatim M-Disc DVD-R. I was going to buy some Verbatim M-Disc BD-R but I read a thread that it's just organic dye now and that it's "M-Disc" in name only. What are some good alternatives? Thanks for the advice!

133
 
 
The original post: /r/datahoarder by /u/Extension_Cat6683 on 2024-12-31 21:43:33.

I come across a lot of highly rated sellers on several marketplaces, in Germany and the Netherlands for example, selling used or new WD Ultrastar / Seagate IronWolf drives in 4-20 TB capacities for literally half the price of a new one.

Most of the sellers have 500-1,000 five-star reviews, are proper businesses with VAT invoices, give a 2-3 year warranty on these HDDs, and clearly claim 0 hours / 0 TB on the drives. They say these are overstock from OEMs.

But I don't buy this reasoning...

So my question is: what's the catch? Perhaps you guys know.

Rejects from OEMs or datacenters, old HDDs with overwritten firmware, leaking helium seals?

134
 
 
The original post: /r/datahoarder by /u/fdjadjgowjoejow on 2024-12-31 16:40:24.

Does anyone have a recommendation for a keychain flash drive? It will be BitLockered, if that is relevant. 32 GB is enough. The only thorough post I found was from 6 years ago:

https://old.reddit.com/r/DataHoarder/comments/9azer0/fastlargecompact_keychain_friendly_usbs/

I checked Amazon as well. TIA.

https://www.amazon.com/s?k=keychain+flash+drive

135
 
 
The original post: /r/datahoarder by /u/two_letters on 2024-12-31 16:18:59.

I’m a photographer with about 20-24TB so far. Every year I add 5-6TB.

I currently have a full 16TB 4bay Pegasus in RAID 5 that is backed up to a 16TB OWC Mercury and Backblaze. I also have another 12TB Mercury for cold storage that’s running low. The Mercurys are USB 3 and sloooow.

I'm thinking about getting the 32 or 40 TB Thunderbolt Gemini to consolidate everything and speed things up. The extra ports are helpful too.

It would back up the RAID via CCC and allow for extra storage. The Mercurys would both be cold storage. Of course everything would be backed up to Backblaze.

Now here are the dumb questions. If I put the 32 TB Gemini in Independent mode, would it mount as two separate 16 TB drives on my Mac, or as one 32 TB drive? Can I daisy-chain the Pegasus to the Gemini, or is that a bad idea?

Any other suggestions to get more space and speed? I edit on an SSD so that’s not an issue.

136
 
 
The original post: /r/datahoarder by /u/kagein12 on 2024-12-31 13:47:33.

I’m in the process of moving all my data off iCloud and onto local storage, and I need a way to store everything reliably for 40+ years. I’ve got over a terabyte of photos and videos of my kids that I really want them to be able to access in the future. I was originally planning on using 100 GB BDXL discs, but since they need specialized drives, I’m worried those drives won’t be easily available down the road, which might make the data impossible to read. Meanwhile, regular 50 GB BD-R discs can be read by any standard Blu-ray player, and I figure those will still be kicking around decades from now.

So, is there a better way to “cold store” my data with some future-proofing, especially since my storage needs are just going to keep growing? Any advice would be appreciated.

*edit*

I am also (morbidly) considering the possibility that I might drop dead at any moment, so I feel a certain level of set-and-forget is necessary.
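Whatever medium ends up being chosen, one low-effort addition is a checksum manifest stored alongside the data, so whoever reads the discs decades from now can verify nothing has rotted; a minimal sketch with standard tools (the paths are placeholders):

# build a manifest of every file in the archive set
find /path/to/archive -type f -exec sha256sum {} + > manifest.sha256
# ...years later, verify the copies against it
sha256sum -c manifest.sha256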

137
 
 
The original post: /r/datahoarder by /u/WarmGeogre69 on 2024-12-31 20:54:31.

I'm working on a comic book collection, and comic book files are basically .zip/.rar archives with JPEGs inside. I'm looking for a tool that can automatically take the images out of the archives and compress them; extracting, compressing, and re-archiving each book individually takes a lot of time.
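In the absence of a ready-made tool, a small script can do the extract/re-encode/re-zip loop in bulk. This is only a rough sketch: it assumes ImageMagick (mogrify), unzip, and zip are installed, handles only .cbz (zip) books (.cbr would need an unrar step instead), and uses placeholder paths:

#!/bin/sh
# Re-encode the JPEG pages of every .cbz in the current directory at quality 80
# and write the rebuilt books to ./recompressed/ (originals are left untouched).
mkdir -p recompressed
for book in *.cbz; do
    tmp=$(mktemp -d)
    unzip -q "$book" -d "$tmp"
    find "$tmp" -type f \( -iname '*.jpg' -o -iname '*.jpeg' \) -exec mogrify -quality 80 {} +
    (cd "$tmp" && zip -q -r -0 "$OLDPWD/recompressed/$book" .)
    rm -rf "$tmp"
done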

138
 
 
The original post: /r/datahoarder by /u/whosenose on 2024-12-31 20:01:12.

I'm buying two new 16 TB drives for my Synology server, and I'd like to test them for errors before adding them to my RAID volume. All I have available is the Synology, a Windows laptop and a Mac laptop, a mini-PC Ubuntu server (no monitor), and various Raspberry Pis, including Pi 5s running Debian (and no desktops).

I’m comfortable with Linux, MacOS and Windows. What would be my best option to pre-test these drives before adding them? Would I need to buy an adapter to connect them? How long might this take?
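For what it's worth, a common pre-use burn-in on a Linux box (the Ubuntu mini-PC or a Pi 5 would both work, though a Pi over USB will be slow) is a SMART long test plus a destructive badblocks pass; a USB-to-SATA adapter or dock would indeed be needed to connect the bare drives. The device node below is a placeholder, and the badblocks step wipes the drive, so only run it on empty disks:

sudo smartctl -t long /dev/sdX        # extended self-test; poll progress with 'smartctl -a'
sudo badblocks -wsv -b 4096 /dev/sdX  # four-pattern write/read verify; expect days per 16 TB drive
sudo smartctl -a /dev/sdX             # re-check reallocated/pending sector counts afterwards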

139
 
 
The original post: /r/datahoarder by /u/Professional-Rock-51 on 2024-12-31 19:42:52.

I have a Windows Storage Spaces mirror (two drives) formatted as NTFS on my desktop, with all onboard SATA controllers in use. I would like to replace these drives and then selectively copy data from one of them attached via an external SATA-to-USB adapter.

What should I know before attempting this? I know that with a single drive, attaching it by USB "just works", but I've never tried it with a drive that was part of a Storage Spaces mirror. Is there a special process required to import the disk so that it can be seen by the system, or will it be detected and work automatically? I want to figure out a process before I actually disconnect either of these disks, to minimize any problems.

140
 
 
The original post: /r/datahoarder by /u/Cloudssj43 on 2024-12-31 19:03:47.

Operating system:

Ubuntu Server 24.04.1 LTS (Running it as a NAS/homeserver)

Disk Setup:

/ & /home -> 500 GB SSD

/mnt/hdd1 -> 12 TB ext4 partition

/mnt/hdd2 -> 12 TB ext4 partition

/mnt/hdd3 -> 12 TB ext4 partition

/mnt/storage -> MergerFS of hdd1 and hdd2

Software

Snapraid of hdd1 and hdd2 with hdd3 as the parity drive

rsnapshot of some folders in /mnt/storage stored on hdd3

Samba running that gives access to /mnt/storage

Jellyfin/qBittorrent/Sonarr running as Docker containers with /mnt/storage mounted

hd-idle configured to sleep drives after 30 minutes of inactivity (testing shows the drives spin up 3-4 times a day on average)

Status:

Every time I navigate the NAS filesystem from Windows, both hdd1 and hdd2 spin up just to read the filesystem metadata.

Every time I open Jellyfin's homepage, it tries to access all the thumbnail .jpg files.

/mnt/hdd1 (which currently stores all my qBittorrent data) is constantly spinning since I'm seeding some stuff.

Question:

I have a 2 TB SSD that I have the option of putting into my NAS. The ideal scenario would be to use it as a cache so that the most commonly accessed items live on the SSD and filesystem access is limited to it. Is there a way to get such a configuration?

I looked up lvmcache (but I think it requires my HDDs to be in an LVM group, which they are not?) as well as bcache, but I'm a little out of my depth trying to understand how they work. Any advice?

Edit: I'm fine with a drive spinning up when I stream shows/movies from Jellyfin; I mostly want Samba browsing, qBittorrent, and the Jellyfin thumbnails to not spin up my drives.
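As a point of reference, bcache does exactly this kind of SSD-in-front-of-HDD tiering, but it writes its own superblock to the backing device, so it only fits a freshly wiped drive rather than the existing ext4 data (and lvmcache likewise assumes the HDDs are LVM physical volumes). A rough sketch of the moving parts, with every device node a placeholder:

sudo apt install bcache-tools
sudo make-bcache -B /dev/sdd1            # backing HDD partition (must be empty; gets reformatted)
sudo make-bcache -C /dev/nvme0n1p1       # the 2 TB SSD as the cache device
sudo bcache-super-show /dev/nvme0n1p1 | grep cset.uuid
echo "<cset-uuid-from-above>" | sudo tee /sys/block/bcache0/bcache/attach
sudo mkfs.ext4 /dev/bcache0              # filesystem goes on the composite device
echo writeback | sudo tee /sys/block/bcache0/bcache/cache_mode   # cache reads and writes on the SSD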

141
 
 
The original post: /r/datahoarder by /u/idkwhattowritehere on 2024-12-31 18:39:14.

I consider the way Scribd operates to be morally wrong, so I tried to do something about it.

If you want to get rid of that annoying blur, just download this extension. (DESKTOP ONLY, CHROMIUM-BASED BROWSER)

Scribd4free — Bye bye paywall on Scribd :D

142
 
 
The original post: /r/datahoarder by /u/Gnomefort on 2024-12-31 18:03:04.

I am experiencing an issue that I am a bit out of my depth on. I have a RAID5 array (32TB storage consisting of 3x16TB drives). I use a portion of this array to run my personal Plex server.

One folder should contain ~4,000 files; since the issue began I can only see the first 400 files in the folder (the first 400 in alphabetical order, not just random files). Before the issue occurred, I could see them all.

When I look at the total storage capacity of the array, it shows the space as being used, but examining the folder only shows the combined file size of the 400 visible files.

Weirdly, if I go through Plex I can still stream all my files as if they were still there.

I use SoftRAID 8 (on Windows 11) and ran error checks on all my drives; it took forever, but the drives themselves seem okay.

I did a full rebuild of the array (it took 21 days) after SoftRAID reported a sync issue, but the problem persists (the volume rebuild has actually been hung at 15 seconds remaining for 3 days, though the log file says it finished 3 days ago).

Has anyone encountered anything similar? I'm at a loss for how to view the files so I can even pull them off the drives and start over (likely with a mirrored setup), and I don't want to keep using the array until I know what is actually wrong.

143
 
 
The original post: /r/datahoarder by /u/dlangille on 2024-12-31 17:40:54.

I'm making more of my private repos available on GitHub. With GitHub now being the primary source, I need to back it up. Fortunately, I found an easy [for me] solution. I was already using Gitea as my git repo at home, so I created a 'pull mirror' for each repo I want to back up:

https://docs.gitea.com/next/usage/repo-mirror#pulling-from-a-remote-repository

That creates a copy in my local gitea instance.

To go one step farther, because this is about backups after all, I did a git pull for each of those repos onto another host:

[17:32 mydev dvl ~/GitHub-backups/FreshPorts] % ls accounts/ docs/ helper_scripts/ periodics/ check_repos/ freshports/ host-init/ vuxml/ daemontools/ freshports-www-offline/ nginx-config/ databases/ git_proc_commit/ packages-import/

I created a new passphrase-less SSH key pair for use only as a read-only deploy key on those repos. That allows me to use this script to refresh the local working copies on a regular basis:

% cat ~/bin/refresh-GitHub-backups.sh

#!/bin/sh
# Refresh the local working copies of the mirrored GitHub repos.

REPO_DIR="/usr/home/dvl/GitHub-backups"

# find the repo working copies under REPO_DIR
repos=$(find $REPO_DIR -type d -d 2)

for repo in $repos
do
  cd $repo
  # pull over the read-only deploy key created for these backups
  GIT_SSH_COMMAND='ssh -i ~/.ssh/read-only-key -o IdentitiesOnly=yes' git pull -q

  if [ $? != 0 ]
  then
    echo problem in $repo
    exit 1
  fi
done

All of this is stored on ZFS filesystems with regular snapshots provided by sanoid. Backups of this directory are stored on another host.

EDIT: grammar

144
 
 
The original post: /r/datahoarder by /u/MuffinsMcSassyPants on 2024-12-31 17:35:35.

I'm following the 3-2-1 rule for backing up my data, and I'm not sure where to store my offsite hard drive. All the safe deposit boxes around me are full. I've thought about a trusted family member's house. Any other ideas?

145
 
 
The original post: /r/datahoarder by /u/Issey_ita on 2024-12-31 17:12:15.

I'm currently taking down a node in my Proxmox cluster and thinking of converting it to a NAS. The hardware isn't terrible: an i5-4590 with 16 GB of DDR3. My budget is low, around €100, and I'm thinking of buying 3 used 3 TB SAS HDDs (about €21 each) plus an HBA already flashed to IT mode and an SFF-8087 to SFF-8482 cable.

Is this a good idea?

146
 
 
The original post: /r/datahoarder by /u/gnad on 2024-12-31 16:29:14.

So, as far as I know, SATA SSD and NVMe SSD prices are pretty much the same currently, with NVMe SSDs being much faster since they aren't limited by the SATA interface.

I have space to mount a few 2.5-inch drives. Instead of spending money on 2.5-inch SATA SSDs, I figure why not buy NVMe SSDs and put them into 2.5-inch enclosures with a USB interface (not a SATA interface), keeping the option to use the NVMe drives natively when I can.

There are a lot of enclosures of this type (USB 3.2 / USB4), but I have found none in a 2.5-INCH FORM FACTOR, which I need in order to mount them.

Does anyone know if such an enclosure exists?

147
 
 
The original post: /r/datahoarder by /u/tetractys_gnosys on 2024-12-31 15:43:46.

My local Walmart recently put all of their internal and external drives on clearance because they're remodeling and probably won't carry the same inventory afterwards. I got a 12 TB WD My Book to shuck for $128 (was $255), and then the day before yesterday noticed they had marked them down to $64. A WD Black 4 TB internal for $33. Other ludicrous deals. I'm struggling not to go buy the rest of their stock even though I haven't even started my NAS build.

I know many were getting the drive deals from Walmart months ago but if you didn't, go check yours and see if they've started the clearance deals. Mine was late to the party.

148
 
 
The original post: /r/datahoarder by /u/Kamikazepyro9 on 2024-12-31 15:31:57.

My wife and I will soon be moving to my family's property to help with the ranch. It's in an area where my only Internet options are:

  • DSL from CenturyLink
  • Wireless ISP from Rise Broadband (current ISP)
  • Starlink, maybe (although when I search the address it says the zone is full)

Both CenturyLink and Rise Broadband put a 350 GB data cap on their plans, which, when you figure in streaming for 4 people, plus all our phone calls, and then my job, I assume we're going to hit every month easily.

Is there a way for me to still share/download my Linux ISOs and other content?

149
 
 
The original post: /r/datahoarder by /u/romeyroam on 2024-12-31 14:57:02.

Here's the issue: I have a pretty sizable collection of video media from a now-defunct source. It's meh quality, and nothing is obscure. I have better-quality copies of much of it, and the stuff I don't is of no interest to anybody. The only real attachment I have to this archive is pure sentimentality, as they were internal to a place I loved.

I am not short on space and in no imminent danger of being so, but I have begun to see no point to keeping them, and I wouldn't mind the 14TB back, if for no other reason than to back up other stuff that I'd like more redundancy on.

My question is *gasp*: when is it OK to delete something you definitely don't need, that is lowkey standing in the way of what you'd like to do, but that is a memento of times gone by? How do you handle stuff like that? I'm not in a position to just keep adding hardware, though there's no current pressure to find space either. However, I see 14 TB sitting there doing nothing; it will continue doing nothing, and it hasn't done anything for years.

**the thinking man pose**

150
 
 
The original post: /r/datahoarder by /u/BroccoliNormal5739 on 2024-12-31 14:20:47.

Somehow, like SATA SSDs, I have started collecting M.2 sticks.

I have two 2 TB NVMe sticks, two 1 TB NVMe sticks, and a number of 1 TB M.2 SSDs. Yay!

I have seen a dual M.2 carrier in a SATA drive footprint. Has anyone used one of these?

Likewise, I have a box of 2.5" SSDs. Can anyone recommend a chassis for holding a large number of them?
