The reason I'm using a C246 motherboard and other high-end, expensive components with such a cheap CPU: someone will sell me a used Supermicro mobo + ECC RAM + SAS HDDs for 500 euro, but with no CPU. A Celeron G4900T costs about 15 euro, and I was wondering if it would suffice, because this build is already overkill for me; at the moment I have no idea how to use the 36 TB of storage (2x redundancy).
datahoarder
Who are we?
We are digital librarians. Among us are represented the various reasons to keep data -- legal requirements, competitive requirements, uncertainty of permanence of cloud services, distaste for transmitting your data externally (e.g. government or corporate espionage), cultural and familial archivists, internet collapse preppers, and people who do it themselves so they're sure it's done right. Everyone has their reasons for curating the data they have decided to keep (either forever or For A Damn Long Time). Along the way we have sought out like-minded individuals to exchange strategies, war stories, and cautionary tales of failures.
We are one. We are legion. And we're trying really hard not to forget.
-- 5-4-3-2-1-bang from this thread
You don't mention your performance requirements, and I'm unfamiliar with that CPU. Are you trying to saturate your (presumably 1G) NIC? Reads or writes?
No, just thousands of small files. Windows takes around a minute to enumerate all the files in the main share via SMB
@Moonrise2473
That looks more like an ARC problem. The ARC can hold a large index of the filesystem if you give it enough room in RAM, avoiding the need to seek out thousands of files on a spinning disk, which takes time. HDDs are fine for sequential operations; random I/O, which is your use case, is their biggest weakness.
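If the slow part really is metadata lookups, one way to check and tune this (a sketch, assuming Linux OpenZFS and a hypothetical dataset name `tank/share` backing the SMB share) is:

```shell
# Inspect current ARC size and hit/miss counters
# (Linux OpenZFS exposes these under /proc/spl/kstat/zfs/)
grep -E '^(size|hits|misses) ' /proc/spl/kstat/zfs/arcstats

# Optionally bias the cache toward metadata for that dataset, so directory
# enumeration stays in RAM even when file data would otherwise evict it.
# Note: this stops ZFS from caching file *data* for this dataset.
zfs set primarycache=metadata tank/share
```

Whether `primarycache=metadata` helps depends on the workload; with plenty of RAM, the default `all` usually keeps metadata resident anyway.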
@eleitl
You should be good, then. You probably don't need SSDs for ZIL or L2ARC either. Don't forget to schedule a weekly scrub to catch bit rot -- essential for large drives.
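Scheduling the scrub can be as simple as a cron entry (a sketch; `tank` is a hypothetical pool name, and TrueNAS/FreeNAS also exposes this in the GUI under Tasks, which is the preferred route there):

```shell
# /etc/crontab entry: scrub the pool every Sunday at 03:00.
# `zpool scrub` returns immediately; check progress with `zpool status tank`.
0 3 * * 0 root /sbin/zpool scrub tank
```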
It’s ok until you start using jails or dedup.
ZFS tends to be more RAM intensive than other filesystems, so make certain you have, at a bare minimum, around 16GB. But I would push for more.
@housepanther
As Lawrence said: "It's not ram intensive, it's ram efficient."
It doesn't let RAM sit there unused. In general you only really need 1G of RAM per 1T of storage, outside some very rare cases. The more RAM you throw at it, the snappier it becomes, but there are diminishing returns: for example, 128G of RAM on a 20T array won't be fully utilized most of the time.
L2ARC raises RAM requirements, because you also need to store its index in RAM.
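As a back-of-envelope check on that overhead (a sketch; the ~70 bytes per cached record is an often-quoted approximation of the in-RAM header cost per L2ARC block, and it varies by OpenZFS version; the 512 GiB device size is hypothetical):

```shell
# Rough RAM cost of indexing an L2ARC device, assuming every cached block
# is a full 128 KiB record. Small files shrink the effective block size
# and make this overhead much larger.
l2arc_bytes=$((512 * 1024 * 1024 * 1024))   # hypothetical 512 GiB L2ARC SSD
recordsize=$((128 * 1024))                  # default ZFS recordsize
header_bytes=70                             # approx. ARC header per block
overhead=$(( l2arc_bytes / recordsize * header_bytes ))
echo "$(( overhead / 1024 / 1024 )) MiB of ARC used for L2ARC headers"
# → 280 MiB of ARC used for L2ARC headers
```

With the default recordsize the index is cheap, but the same device full of tiny blocks can eat gigabytes of ARC, which is why L2ARC only pays off once RAM is already maxed out.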
@Moonrise2473