Subject: Re: [ADSM-L] million files backup
From: Dan Olson <dolson AT MCS.ANL DOT GOV>
To: ADSM-L AT VM.MARIST DOT EDU
Date: Thu, 2 Feb 2012 11:40:03 -0600
I'm able to back up several systems with hundreds of millions of files using the
diskcache method.
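
For reference, a minimal client-option sketch for the diskcache method (the
cache path below is only an example; point DISKCACHELOCATION at a local
filesystem with enough free space for the cache database):

    * dsm.sys / dsm.opt excerpt -- a sketch, not a full client config
    * /var/tsm/diskcache is an example path, not a required location
    MEMORYEFFICIENTBACKUP  DISKCACHEMETHOD
    DISKCACHELOCATION      /var/tsm/diskcache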

If your backup is running that slowly on a GPFS filesystem, you probably have a
performance problem with the filesystem itself. Can you give a quick overview of
the GPFS environment? E.g., are the disks SATA, SAS, or FC-attached? In what
topology? What is the interconnect between the nodes?

----
Daniel Murphy-Olson
Systems Administrator
Mathematics & Computer Science Division
Argonne National Laboratory
630-252-0055

----- Original Message -----
From: "Howard Coles" <Howard.Coles AT ARDENTHEALTH DOT COM>
To: ADSM-L AT VM.MARIST DOT EDU
Sent: Thursday, February 2, 2012 11:11:07 AM
Subject: Re: [ADSM-L] million files backup

Have you tried using the memoryefficient=diskcache method?  I'm assuming you
have.  It takes a while, but on fast systems I've had pretty decent results.
With a file system that large, though, you may need to go the virtual mount
point route (see the dsm.sys sketch below).
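
For example (the /gpfs paths are hypothetical), carving the GPFS filesystem into
separate filespaces in dsm.sys looks roughly like:

    * dsm.sys excerpt -- hypothetical GPFS subtrees, one per line
    VIRTUALMOUNTPOINT /gpfs/fs1/projects
    VIRTUALMOUNTPOINT /gpfs/fs1/home

Each virtual mount point then shows up as its own filespace, so separate dsmc
processes can back up the subtrees in parallel.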


See Ya'
Howard Coles Jr.
John 3:16!


-----Original Message-----
From: ADSM: Dist Stor Manager [mailto:ADSM-L AT VM.MARIST DOT EDU] On Behalf Of 
Jorge Amil
Sent: Thursday, February 02, 2012 9:06 AM
To: ADSM-L AT VM.MARIST DOT EDU
Subject: Re: [ADSM-L] million files backup

Hi Jim,

Thank you very much for your answer.

We are already doing what you suggest: tarring the filesystem. That was a great
solution when the filesystem was 500 GB-1 TB, but our filesystem is now 14 TB,
and the tar/gzip command takes 10-12 days... :(

So we need another approach.

Thanks
Jorge

> Date: Thu, 2 Feb 2012 08:47:24 -0600
> From: jschneider AT USSCO DOT COM
> Subject: Re: [ADSM-L] million files backup
> To: ADSM-L AT VM.MARIST DOT EDU
>
> Jorge,
>
> On Unix systems:
> I've done it in two steps.  Create a tar file of the file system and zip
> it.  Create a second file listing all the files in the tarred directory.
> The tar extract command allows single files to be recalled if the
> absolute path name is available.
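>
> Roughly, with placeholder paths (note that GNU tar strips the leading /
> from member names, so single-file extracts use the relative path):
>
>   tar -czf /backup/fs.tar.gz /data/fs          # step 1: archive + compress
>   tar -tzf /backup/fs.tar.gz > /backup/fs.toc  # step 2: index for restores
>   tar -xzf /backup/fs.tar.gz data/fs/one/file  # recall a single file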
>
> I've used this to back up a Sterling Commerce flat-file database with
> multiple subdirectories holding more than 1.5 million files.  The
> tar/gzip command took 5 or 6 hours for a 500 GB file system.
>
> I have not tried to do this on a Windows system.
>
> Jim Schneider
>
> -----Original Message-----
> From: ADSM: Dist Stor Manager [mailto:ADSM-L AT vm.marist DOT edu] On Behalf 
> Of
> Jorge Amil
> Sent: Thursday, February 02, 2012 8:30 AM
> To: ADSM-L AT vm.marist DOT edu
> Subject: [ADSM-L] million files backup
>
> Hi everybody,
>
> Does anyone know the best way to back up a filesystem that contains
> millions of files?
>
> Image backup is not possible because this is a GPFS filesystem, which is
> not supported.
>
> Thanks in advance
>
> Jorge
>
