Whenever we have a situation like this we always tar.gz into one file first,
before archiving to TSM, either the top-level directory or multiple lower-level
directories, whichever makes sense given the expected retrieval pattern.
It is also very useful to run the tar verbosely, write the file listing out to
a file, and archive that listing along with the .gz. That way you can later see
what was in the archive without having to retrieve and extract the .gz file,
and you can demonstrate to auditors that the files were archived successfully.
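A minimal sketch of that workflow (the paths and directory names below are
hypothetical, used only for illustration):

```shell
set -e

# Hypothetical source directory to be archived
mkdir -p /tmp/demo_src
echo "sample" > /tmp/demo_src/file1.txt

# 1. Bundle the directory into a single compressed tarball
tar czf /tmp/demo_src.tar.gz -C /tmp demo_src

# 2. Write the archive's file listing to a companion file,
#    so the contents can be checked later without extracting
tar tzf /tmp/demo_src.tar.gz > /tmp/demo_src.tar.gz.list

cat /tmp/demo_src.tar.gz.list
```

Both the tarball and the .list file would then be sent to TSM together, e.g.
with `dsmc archive /tmp/demo_src.tar.gz /tmp/demo_src.tar.gz.list` (exact
dsmc options depend on your client configuration).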
--
Graham Stewart
Network and Storage Services Manager
Information Technology Services
University of Toronto Libraries
416-978-6337
On Jan 20, 2017, at 09:22, Bo Nielsen <boanie AT DTU DOT DK> wrote:
Hi all,
I need advice.
I need to archive 80 billion small files, but as I see it that is not
possible, since it would fill the TSM database with about 73 TB.
The filespace is mounted on a Linux server.
Is there a way to pack/zip the files so that there is a smaller number of
files? Has anybody tried this?
Regards,
Bo Nielsen
IT Service
Technical University of Denmark
IT Service
Frederiksborgvej 399
Building 109
DK - 4000 Roskilde
Denmark
Mobile +45 2337 0271
boanie AT dtu DOT dk