It's A Digital Disease!


This is a sub that aims to bring data hoarders together to share their passion with like-minded people.

251
 
 
The original post: /r/datahoarder by /u/Deadboy90 on 2025-01-01 00:04:16.
252
 
 
The original post: /r/datahoarder by /u/bob-bulldog-briscoe on 2024-12-31 23:16:26.

I'm looking for some high-quality, archival-grade BD-R discs. I'm all set on CD-R and DVD-R as I have a bunch of old-stock TY on hand from years ago, as well as some Verbatim M-Disc DVD-R. I was going to buy some Verbatim M-Disc BD-R but I read a thread that it's just organic dye now and that it's "M-Disc" in name only. What are some good alternatives? Thanks for the advice!

253
 
 
The original post: /r/datahoarder by /u/Extension_Cat6683 on 2024-12-31 21:43:33.

I come across a lot of highly rated sellers on several marketplaces for used or new products, in Germany and the Netherlands for example. They sell WD Ultrastar / Seagate IronWolf drives in 4-20 TB capacities for literally half the price of a new one.

Most of the sellers have 500-1000 five-star reviews, are proper businesses with VAT invoices, give a 2-3 year warranty on these HDDs, and clearly claim 0 hours / 0 TB on these drives. They say these are overstock from OEMs.

But I don't buy this reasoning.

So my question is: what's the catch? Perhaps you guys know.

Rejects from OEMs or datacenters, old HDDs with rewritten FW, leaking helium seals?

254
 
 
The original post: /r/datahoarder by /u/fdjadjgowjoejow on 2024-12-31 16:40:24.

Anyone have a recommendation for a keychain flash drive? It will be BitLocker'ed, if that is relevant. 32GB is enough. The only thorough post I found was from 6 years ago:

https://old.reddit.com/r/DataHoarder/comments/9azer0/fastlargecompact_keychain_friendly_usbs/

I checked Amazon as well. TIA.

https://www.amazon.com/s?k=keychain+flash+drive

255
 
 
The original post: /r/datahoarder by /u/two_letters on 2024-12-31 16:18:59.

I’m a photographer with about 20-24TB so far. Every year I add 5-6TB.

I currently have a full 16TB 4bay Pegasus in RAID 5 that is backed up to a 16TB OWC Mercury and Backblaze. I also have another 12TB Mercury for cold storage that’s running low. The Mercurys are USB 3 and sloooow.

I’m thinking about getting the 32 or 40TB thunderbolt Gemini to consolidate everything and speed things up. The extra ports are helpful too.

It would back up the RAID via CCC and allow for extra storage. The Mercurys would both be cold storage. Of course everything would be backed up to Backblaze.

Now here are the dumb questions. If I put the 32TB Gemini in Independent mode, would it mount as two separate 16TB drives on my Mac, or as one 32TB drive? Can I daisy-chain the Pegasus to the Gemini, or is that a bad idea?

Any other suggestions to get more space and speed? I edit on an SSD so that’s not an issue.

256
 
 
The original post: /r/datahoarder by /u/kagein12 on 2024-12-31 13:47:33.

I’m in the process of moving all my data off iCloud and onto local storage, and I need a way to store everything reliably for 40+ years. I’ve got over a terabyte of photos and videos of my kids that I really want them to be able to access in the future. I was originally planning on using 100 GB BDXL discs, but since they need specialized drives, I’m worried those drives won’t be easily available down the road, which might make the data impossible to read. Meanwhile, regular 50 GB BD-R discs can be read by any standard Blu-ray player, and I figure those will still be kicking around decades from now.

So, is there a better way to “cold store” my data with some future-proofing, especially since my storage needs are just going to keep growing? Any advice would be appreciated.

*edit*

I am also considering the possibility (morbidly) that I might drop dead at any moment, so I feel a certain level of set-and-forget is necessary.
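(Not something the poster mentioned, but a commonly suggested complement for optical cold storage is to generate parity/recovery data alongside the files before burning, e.g. with par2, so a partially degraded disc can still be repaired. A minimal sketch, assuming a staging directory of files destined for one disc; the path is a placeholder.)

#!/bin/sh
# Placeholder staging directory holding the files for one disc.
cd /path/to/disc-staging || exit 1
# Create ~10% recovery data; par2 operates on files, so subdirectories need their own sets.
par2 create -r10 recovery.par2 ./*
# Later: par2 verify recovery.par2   (or: par2 repair recovery.par2)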

257
 
 
The original post: /r/datahoarder by /u/WarmGeogre69 on 2024-12-31 20:54:31.

Working on a comic book collection, and comic book files are basically .zip/.rar archives with JPGs inside. I'm looking for a tool that can automatically take the images out of the archives and compress them. Extracting, compressing, and re-archiving each book individually takes a lot of time.
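A rough batch sketch of what such a tool would do, assuming the books are .cbz (zip) archives in the current directory and that ImageMagick's mogrify is an acceptable recompressor (the tool choice and quality setting are assumptions, not the poster's):

#!/bin/sh
# Recompress every .cbz in the current directory: extract, shrink the JPGs, repack.
for book in *.cbz; do
    tmp=$(mktemp -d) || exit 1
    unzip -q "$book" -d "$tmp"                                 # extract the pages
    find "$tmp" -iname '*.jpg' -exec mogrify -quality 80 {} +  # recompress in place (ImageMagick)
    out="$PWD/${book%.cbz}.recompressed.cbz"
    (cd "$tmp" && zip -q -r "$out" .)                          # repack as a new .cbz
    rm -rf "$tmp"
done

.cbr (rar) input would need unrar for the extraction step, and the result still gets repacked as a zip/.cbz.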

258
 
 
The original post: /r/datahoarder by /u/whosenose on 2024-12-31 20:01:12.

I’m buying two new 16 TB drives for my Synology server, and I’d like to test them for errors before adding them to my RAID volume. All I have available is the Synology, a Windows and a Mac laptop, a miniPC Ubuntu server (no monitor) and various Raspberry Pis including 5s running Pi Debian (and no desktops).

I’m comfortable with Linux, MacOS and Windows. What would be my best option to pre-test these drives before adding them? Would I need to buy an adapter to connect them? How long might this take?
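One common approach (a sketch, not the poster's plan): attach each drive to the Ubuntu box with a USB-SATA adapter and run a destructive badblocks pass plus a SMART self-test. The device name below is a placeholder, and a full four-pattern badblocks run on a 16 TB drive can take several days per drive.

#!/bin/sh
# /dev/sdX is a placeholder -- confirm the device with lsblk before running.
# WARNING: badblocks -w destroys any data on the drive.
DISK=/dev/sdX
badblocks -wsv -b 4096 "$DISK"   # destructive four-pattern write/read test of the whole drive
smartctl -t long "$DISK"         # then kick off a long SMART self-test (runs inside the drive)
smartctl -a "$DISK"              # later, review the self-test result and reallocated/pending sectors

Some USB-SATA bridges need smartctl -d sat to pass SMART commands through; a few won't pass them at all.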

259
 
 
The original post: /r/datahoarder by /u/Professional-Rock-51 on 2024-12-31 19:42:52.

I have a Windows Storage Spaces mirror (two drives) formatted as NTFS on my desktop with all onboard SATA controllers used. I would like to replace these drives and then selectively copy data from one of them attached to an external SATA->USB adapter.

What should I know before attempting this? I know that when it's a single drive, attaching it by USB "just works", but I've never tried doing this with a drive that was used in a Storage Spaces mirror. Is there a special process required to import the disk so that it can be seen by the system, or will it be detected and work automatically? I want to figure out a process before I actually disconnect either of these disks, to minimize any problems.

260
 
 
The original post: /r/datahoarder by /u/Cloudssj43 on 2024-12-31 19:03:47.

Operating system:

Ubuntu Server 24.04.1 LTS (Running it as a NAS/homeserver)

Disk Setup:

/ & /home -> 500 GB SSD

/mnt/hdd1 -> 12 TB ext4 partition

/mnt/hdd2 -> 12 TB ext4 partition

/mnt/hdd3 -> 12 TB ext4 partition

/mnt/storage -> MergerFS of hdd1 and hdd2

Software

Snapraid of hdd1 and hdd2 with hdd3 as the parity drive

rsnapshot of some folders in /mnt/storage stored on hdd3

Samba running that gives access to /mnt/storage

Jellyfin/QBittorrent/Sonarr running as Docker containers that have /mnt/storage mounted

hd-idle configured to sleep drives after 30min of inactivity (Testing shows drives spin up on avg 3-4 times in 1 day)

Status:

Every time I navigate the NAS filesystem in Windows, both hdd1 and hdd2 spin up just to read the filesystem metadata.

Every time I open Jellyfin's homepage, it tries to access all the thumbnail .jpg files.

/mnt/hdd1 (Which currently stores all my Qbittorrent data) is constantly spinning since I'm seeding some stuff.

Question:

I have a 2TB SSD that I have the option of putting into my NAS and the ideal scenario would be it being used as a cache such that the most commonly used items get stored onto it so that filesystem access is limited to only the SSD. Is there a way to get such a configuration?

I looked up lvmcache (but I think it requires my HDDs to be in an LVM volume group, which they are not?) as well as bcache, but I'm a little out of my depth trying to understand how they work. Any advice?

Edit: I'm fine with a drive spinning up when I stream my shows/movies from Jellyfin; I mostly want the Samba browsing, Qbittorrent, and Jellyfin thumbnails to not spin up my drives.
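For what it's worth, a minimal lvmcache sketch, assuming one data HDD and the SSD were both handed to LVM (which would indeed mean migrating the existing ext4 data onto an LV first); device names are placeholders:

#!/bin/sh
# Placeholders: /dev/sdb = one 12 TB HDD, /dev/nvme0n1 = the 2 TB SSD.
pvcreate /dev/sdb /dev/nvme0n1
vgcreate vg_nas /dev/sdb /dev/nvme0n1
lvcreate -n data  -l 100%PVS vg_nas /dev/sdb        # big LV on the HDD
lvcreate -n cache -l 100%PVS vg_nas /dev/nvme0n1    # cache LV on the SSD
lvconvert --type cache --cachevol vg_nas/cache vg_nas/data
mkfs.ext4 /dev/vg_nas/data                          # then copy the data back onto the cached LV

Note that lvmcache (and bcache) cache hot blocks rather than pinning metadata, so they may help but aren't guaranteed to keep the HDDs asleep for directory browsing.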

261
 
 
The original post: /r/datahoarder by /u/idkwhattowritehere on 2024-12-31 18:39:14.

I consider Scribd's way of operating morally questionable, so I tried to fix that.

If you want to get rid of that annoying blur, just download this extension. (DESKTOP ONLY, CHROMIUM-BASED BROWSER)

Scribd4free — Bye bye paywall on Scribd :D

262
 
 
The original post: /r/datahoarder by /u/Gnomefort on 2024-12-31 18:03:04.

I am experiencing an issue that I am a bit out of my depth on. I have a RAID5 array (32TB storage consisting of 3x16TB drives). I use a portion of this array to run my personal Plex server.

One folder should contain ~4000 files; since the issue began, I can only see the first 400 files in the folder (the first 400 in alphabetical order, not random files). Before the issue occurred, I could see them all.

When I look at the total storage capacity of the array, it shows the space as being used, but examining the folder only shows the combined filesize of the 400 visible files.

Weirdly, if I go through Plex, I can still stream all my files as if they were still there.

I use SoftRAID 8 (on Windows 11) and ran error checks on all my drives; it took forever, but the drives themselves seem okay.

I did a full rebuild of my array (it took 21 days) after SoftRAID said I had a sync issue, but the problem still persists (the volume rebuild has actually been hung at 15 seconds remaining for 3 days, though the log file says it finished 3 days ago).

Has anyone encountered anything similar? I'm just at a loss for how to view the files so I can even pull them off the drives and start over again (likely with a mirrored setup), and I don't want to just continue using the array until I know what the heck is actually wrong.

263
 
 
The original post: /r/datahoarder by /u/dlangille on 2024-12-31 17:40:54.

I'm making more of my private repos available on GitHub. With that now being the primary source, I need to back that up. Fortunately, I found an easy [for me] solution. I was already using gitea as my git server at home, so I created a 'pull mirror' for each repo I want to back up:

https://docs.gitea.com/next/usage/repo-mirror#pulling-from-a-remote-repository

That creates a copy in my local gitea instance.

To go one step further, because this is about backups after all, I did a git pull for each of those repos onto another host:

[17:32 mydev dvl ~/GitHub-backups/FreshPorts] % ls accounts/ docs/ helper_scripts/ periodics/ check_repos/ freshports/ host-init/ vuxml/ daemontools/ freshports-www-offline/ nginx-config/ databases/ git_proc_commit/ packages-import/

I created a new passphrase-less SSH key pair for use only as a read-only deploy key on those repos. That allows me to use this script to refresh the local working copies on a regular basis:

% cat ~/bin/refresh-GitHub-backups.sh

#!/bin/sh

REPO_DIR="/usr/home/dvl/GitHub-backups"

repos=$(find $REPO_DIR -type d -depth 2)

for repo in $repos
do
  cd $repo
  GIT_SSH_COMMAND='ssh -i ~/.ssh/read-only-key -o IdentitiesOnly=yes' git pull -q

  if [ $? != 0 ]; then
    echo problem in $repo
    exit 1
  fi
done

All of this is stored on ZFS filesystems with regular snapshots provided by sanoid. Backups of this directory are stored on another host.

EDIT: grammar

264
 
 
The original post: /r/datahoarder by /u/MuffinsMcSassyPants on 2024-12-31 17:35:35.

I’m following the 3-2-1 rule for backing up my data and I’m not sure where to store my offsite hard drive. All the safety deposit boxes around me are full. I’ve thought about a trusted family member's house. Any other ideas?

265
 
 
The original post: /r/datahoarder by /u/Issey_ita on 2024-12-31 17:12:15.

I'm currently taking down a node in my Proxmox cluster and I'm thinking of converting it to a NAS. The HW isn't terrible: i5-4590, 16 GB DDR3. My budget is low, around 100€, and I'm thinking of buying 3 used 3 TB SAS HDDs (about 21€ each) + an HBA already flashed in IT mode + an SFF-8087 to SFF-8482 cable.

Is this a good idea?

266
 
 
The original post: /r/datahoarder by /u/gnad on 2024-12-31 16:29:14.

So, as far as I know, SATA SSD and NVMe SSD prices are pretty much the same currently, with NVMe SSDs being much faster since they aren't limited by the SATA interface.

I have space to mount a few 2.5-inch drives. Instead of spending money on 2.5-inch SATA SSDs, I figure why not buy NVMe SSDs, put them into 2.5-inch enclosures with a USB interface (not SATA), and keep the option of using them as NVMe drives later.

There are a lot of enclosures of this type (USB 3.2 / USB4), but I have found none in a 2.5-INCH FORM FACTOR, which I need in order to mount them.

Does anyone know if such an enclosure exists?

267
 
 
The original post: /r/datahoarder by /u/tetractys_gnosys on 2024-12-31 15:43:46.

My local Walmart recently put all of their internal and external drives on clearance because they're remodeling and probably won't carry the same inventory afterwards. Got a 12TB WD My Book to shuck for $128 (was $255), and then the day before yesterday noticed they'd marked them down to $64. WD Black 4TB internal for $33. Other ludicrous deals. I'm struggling not to go buy the rest of their stock even though I don't have my NAS build started.

I know many were getting the drive deals from Walmart months ago but if you didn't, go check yours and see if they've started the clearance deals. Mine was late to the party.

268
 
 
The original post: /r/datahoarder by /u/Kamikazepyro9 on 2024-12-31 15:31:57.

My wife and I will soon be moving to my family's property to help with the ranch. It's in an area where my only Internet options are:

DSL from CenturyLink
Wireless ISP from Rise Broadband (current ISP)
Starlink, maybe (although when I search the address it says the zone is full)

Both CenturyLink and Rise Broadband put a 350GB data cap on their plans, which, when you figure streaming for 4 people, plus all phone calls, and then my job, I assume we're gonna hit easily every month.

Is there a way for me to still share/download my Linux ISOs and other content?

269
 
 
The original post: /r/datahoarder by /u/romeyroam on 2024-12-31 14:57:02.

Here's the issue: I have a pretty sizable collection of video media from a now-defunct source. It's meh quality, and nothing is obscure. I have better-quality copies of much of it, and the stuff I don't is of no interest to anybody. The only real attachment I have to this archive is pure sentimentality, as they were internal to a place I loved.

I am not short on space and in no imminent danger of being so, but I have begun to see no point to keeping them, and I wouldn't mind the 14TB back, if for no other reason than to back up other stuff that I'd like more redundancy on.

My question is *gasp* when is it ok to delete something you definitely don't need, that is lowkey standing in the way of what you'd like to do, but is a memento of times gone by? How do you handle stuff like that? I'm not in a position to just keep adding hardware, though there's no current pressure to find space. However, I see 14TB sitting there, and it's doing nothing, will continue doing nothing, and hasn't done anything for years.

**the thinking man pose**

270
 
 
The original post: /r/datahoarder by /u/BroccoliNormal5739 on 2024-12-31 14:20:47.

Somehow, like SATA SSDs, I have started collecting M.2 sticks.

I have 2 2TB NVMe, 2 1TB NVMe, and a number of 1TB M.2 SSDs. Yay!

I have seen dual M.2 carriers in a SATA drive footprint. Has anyone used one of these?

Likewise, I have a box of 2.5" SSDs. Can anyone recommend a chassis that holds a large number of them?

271
 
 
The original post: /r/datahoarder by /u/gryponyx on 2024-12-31 11:48:59.

Bought a new internal HDD. Is full-disk encryption recommended, or should I just use encrypted containers with VeraCrypt? If I download something and then transfer it over to the container, won't there be traces of what I downloaded on the unencrypted part of the hard drive?

272
 
 
The original post: /r/datahoarder by /u/juste_k3nkai on 2024-12-31 11:17:23.

I'm new to using this extension; is there any way to automatically download files? I'm trying to download a board from Pinterest, and manually saving each one is going to take a while and kinda defeats the purpose of getting the extension.
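Outside the extension itself, a command-line tool like gallery-dl can grab an entire Pinterest board in one go; a minimal sketch with a placeholder board URL:

#!/bin/sh
# Placeholder board URL; gallery-dl is a separate command-line tool, not the extension above.
pip install gallery-dl
gallery-dl "https://www.pinterest.com/SOME_USER/SOME_BOARD/"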

273
 
 
The original post: /r/datahoarder by /u/randopop21 on 2024-12-31 09:49:33.

I know that back in the past, a quick format in Windows was not a good test of media. For a long time there has been a "long" format (i.e. not a "quick" format), but I've never used it (i.e. I didn't trust it).

Is it sufficient to test media like a USB stick? How about a hard drive or SSD?

My current need is to test some USB sticks on which I will store Linux ISOs. Yes, ACTUAL Linux ISOs.

To be specific, I will create a Ventoy boot USB stick and put various distros on it that I'd like to try. It'd be best if I could count on the ISOs being exact and containing no bad bits. So I want to test the USB stick.

By the way, I've found that via Task Manager I can see the current data transfer rate for a particular drive. In the case of my USB stick, I'm getting a 20MB/sec write transfer rate. Is this good for a USB 3.0 stick on a USB 3.0 port? It's a cheap off-shore 32GB stick that claims to be a name brand. I'm dubious about it, but I just need some reliable storage, not max performance.
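For checking whether a suspect stick actually holds what it claims (a sketch, not from the post), the f3 tools fill the mounted stick with test files and read them back; the mount point below is a placeholder:

#!/bin/sh
# Placeholder mount point for the USB stick.
f3write /media/usbstick   # fill the free space with verifiable test files
f3read  /media/usbstick   # read them back and report any corrupted/overwritten sectors

H2testw does the equivalent job on Windows, and the ISOs themselves can be checked against the distro's published checksums with sha256sum.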

274
 
 
The original post: /r/datahoarder by /u/Andeh86 on 2024-12-31 09:39:32.

Hi, I hope someone can advise me on my predicament; I'm at a bit of a loss and don't know where to look.

I currently have 7 drives connected to my Windows machine: 6 internal, one external.

4 x HDD (1 external), 3 x SSD

I'm going to be getting a Mac mini (it's the cheapest and best solution for me to upgrade my current set-up, as all I do is edit vlogs and it will work perfectly for that). However, I'll need to be able to access all 3 years' worth of footage from my drives. They're already exFAT, so Mac-compatible, but I will only have 5 USB-C (3 Thunderbolt) ports available to me, and I'll need one of them for my monitor.

What would be your best advice or is there a unit I can get that would allow me to access them all easily?

Personal note: I really enjoy getting deep into the nerdy specs of this stuff but right now I'm currently on meds for depression and getting really overwhelmed looking at it all so could just do with some guidance.

Thanks in advance 🙏

275
 
 
The original post: /r/datahoarder by /u/andreas0069 on 2024-12-31 09:33:48.

Original Title: I recently got my hands on 0.5PB of drives! 50x 10TB SAS disks. The seller had no clue and did not care much, so I got them as a bargain. They were 520-byte block size, so I made a guide on how to make them 512-byte block size!
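The linked guide isn't reproduced here, but for context, reformatting a 520-byte-sector SAS drive to 512 bytes is typically done with sg_format from sg3_utils, roughly as below; the device name is a placeholder, the operation wipes the drive, and it can take many hours per disk:

#!/bin/sh
# Placeholder device; identify the right one with lsscsi or sg_map first.
sg_format --format --size=512 /dev/sg0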
