Given a whole dump blob that has landed in holding disk, I can look at
it (for example, to answer the question "Why is that so _big_?") with:
dd if=foo.verilab.com._.1 bs=32k skip=1 | tar tvf - | sort -k3,3nr | head
(Note: modern GNU sort no longer accepts the old "+2nr" key syntax;
"-k3,3nr" is the equivalent.)
This is pretty well documented.
What if, instead, my dump blob is "chunked", as in this case (1GB chunks):
% ls -ltr foo.verilab.com._.__disc1.*
-rw------- 1 amanda disk 1073741824 Nov 19 00:39 foo.verilab.com._.__disc1.1
-rw------- 1 amanda disk  123912192 Nov 19 00:40 foo.verilab.com._.__disc1.1.1
I can certainly do the same 'dd if=' game on the initial 1GB chunk,
but what if I want to find the biggest thing across the whole blob (2
files in this case)? Thanks,
Will
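
[A possible approach, sketched below. This assumes each holding-disk chunk,
including continuation chunks like *.__disc1.1.1, begins with its own 32 KiB
Amanda header; if so, you can strip 32 KiB from every chunk and concatenate
the payloads back into one tar stream. The filenames and the simulated chunk
data here are illustrative, not taken from the post above.]

```shell
#!/bin/sh
# Sketch: strip the 32 KiB header from each chunk, concatenate the
# payloads, and list the combined tar stream by size.
# (Assumption: every chunk, not just the first, carries a 32 KiB header.)
set -e
workdir=$(mktemp -d)
cd "$workdir"

# --- simulate two chunks: a 32 KiB fake header + half of a tar archive each ---
mkdir data
printf 'hello' > data/small.txt
dd if=/dev/zero of=data/big.bin bs=1k count=64 2>/dev/null
tar cf whole.tar data
size=$(wc -c < whole.tar)
half=$((size / 2))
{ dd if=/dev/zero bs=32k count=1 2>/dev/null
  head -c "$half" whole.tar; } > chunk.1
{ dd if=/dev/zero bs=32k count=1 2>/dev/null
  tail -c +"$((half + 1))" whole.tar; } > chunk.1.1

# --- the actual technique: skip each chunk's header, then list by size ---
for f in chunk.1 chunk.1.1; do
    dd if="$f" bs=32k skip=1 2>/dev/null
done | tar tvf - | sort -k3,3nr | head
```

With real Amanda chunks the loop would simply run over
foo.verilab.com._.__disc1.* in order. If it turns out only the first chunk
carries a header, drop the "skip=1" for the continuation chunks.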