datahoarder
Who are we?
We are digital librarians. Among us are represented the various reasons to keep data -- legal requirements, competitive requirements, uncertainty of permanence of cloud services, distaste for transmitting your data externally (e.g. government or corporate espionage), cultural and familial archivists, internet collapse preppers, and people who do it themselves so they're sure it's done right. Everyone has their reasons for curating the data they have decided to keep (either forever or For A Damn Long Time). Along the way we have sought out like-minded individuals to exchange strategies, war stories, and cautionary tales of failures.
We are one. We are legion. And we're trying really hard not to forget.
-- 5-4-3-2-1-bang from this thread
@housepanther
As Lawrence said: "It's not ram intensive, it's ram efficient."
It doesn't let RAM sit there unused. As a rule of thumb you only need about 1 GB of RAM per 1 TB of storage, outside some very rare cases. Throwing more RAM at it does make it snappier, but with diminishing returns: 128 GB of RAM on a 20 TB array won't be fully utilized most of the time.
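The rule of thumb above is easy to sketch in code. This is a minimal illustration of the community sizing heuristic only, not anything ZFS enforces; the function name and the 8 GB floor (a common baseline recommendation) are my own assumptions:

```python
# Sketch of the rule of thumb quoted above: ~1 GB of RAM per 1 TB of
# pool storage. The 8 GB floor is an assumed baseline, not a ZFS rule.

def suggested_ram_gb(pool_tb: float, floor_gb: float = 8.0) -> float:
    """Rule-of-thumb RAM sizing: 1 GB per TB, with a minimum floor."""
    return max(floor_gb, pool_tb * 1.0)

print(suggested_ram_gb(20))  # 20 TB array -> 20.0 (GB)
print(suggested_ram_gb(4))   # small pool -> 8.0 (the floor)
```

Past that point, extra RAM just grows the ARC cache, which is where the diminishing returns come in.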
L2ARC raises RAM requirements, because its index also has to live in RAM: every block cached on the L2ARC device keeps a small header in the ARC.
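To get a feel for that index overhead, here's a back-of-the-envelope sketch. The ~70-byte per-record header size is an assumption (the real figure varies across OpenZFS versions, and is larger for small records); 128 KiB is the default ZFS recordsize:

```python
# Rough estimate of L2ARC's RAM cost: each block on the L2ARC device
# needs an in-RAM header. HEADER_BYTES is an assumed figure; the real
# per-record overhead depends on the OpenZFS version.

HEADER_BYTES = 70           # assumed per-record L2ARC header size
RECORDSIZE = 128 * 1024     # default ZFS recordsize (128 KiB)

def l2arc_ram_overhead_gib(l2arc_bytes: int) -> float:
    records = l2arc_bytes // RECORDSIZE
    return records * HEADER_BYTES / 2**30

# A 1 TiB L2ARC device full of 128 KiB records:
print(round(l2arc_ram_overhead_gib(2**40), 2))  # ~0.55 GiB of RAM
```

With the default recordsize the cost is modest, but a pool full of small records (or a smaller recordsize) multiplies the record count, and the RAM eaten by headers is RAM taken away from the ARC itself.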
@Moonrise2473