ADSM-L

Subject: Re: Backing up directories with 600K files
From: "Wayne T. Smith" <ADSM AT MAINE DOT EDU>
Date: Wed, 21 Mar 2001 10:02:33 -0500
Jerry wrote, in part:
> I have a customer who has a large number of small files on a server, and is
> seeing long processing times.  He is running the SIEBEL Help desk
> applications, and ultimately will have 1.5 million files on the server.  The
> test he ran included about 600K files, totaling 5GB of space.  Obviously,
> the individual files are not very big (he says 12K is typical).  They are
> already compressed, so we have compression turned off for this client.

Areas of investigation might include ...

1. Consider turning subfile/adaptive differencing off, if on.
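If subfile backup is on, a client option file entry along these lines would
turn it off (a sketch; check your client level for the exact option):

```
* dsm.opt -- disable adaptive subfile (differencing) backup
SUBFILEBACKUP NO
```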

2.  Consider adjusting tuning parameters as they relate to the
    operating system and *SM DB.  For example, maximize TCP settings
    and set transaction/group sizes very large.  There are lots of
    good discussions of this subject in the ADSM-L archives.
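As a rough sketch only -- the option names are from the client manual, but
the values below are illustrative, not recommendations for your network:

```
* dsm.opt (client) -- illustrative tuning values; tune for your network
TCPWINDOWSIZE   63
TCPBUFFSIZE     32
TXNBYTELIMIT    25600

* on the server (admin command), raise the files-per-transaction limit:
* SETOPT TXNGROUPMAX 256
```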

3.  Consider doing some partial incrementals.  For example, maybe the
    restore requirements will allow incremental-by-date during the
    week and then a full-incremental on weekends. *SM manuals discuss
    this well (see the "Using" online manual that comes with the
    client software).
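For example (the path here is hypothetical), the weekday and weekend runs
might look like:

```
* weekdays: incremental-by-date skips the full client/server comparison
dsmc incremental -incrbydate /siebel

* weekend: regular (full) incremental catches deletions and attribute changes
dsmc incremental /siebel
```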

4.  Ensure changing files are not causing "retries".  Evaluate restore
    requirements to minimize or eliminate these backup "retries".  If
    you were compressing with *SM, I'd say either eliminate compression
    or set COMPRESSALWAYS.
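In client option file terms, that might look like this (a sketch; the
compression line applies only if client compression were in use, which it
isn't here):

```
* dsm.opt -- fail changed files immediately instead of retrying
CHANGINGRETRIES 0

* only if client compression were on:
* COMPRESSALWAYS  YES
```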

5.  If the application allows, restructure the data so that no single
    directory holds a large number of files and the directory
    structure is relatively fixed.
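The application side of this is out of *SM's hands, but the idea can be
sketched in shell: hash each file name into a small, fixed set of
subdirectories so that no single directory grows without bound.  All names
and paths below are made up for illustration:

```shell
# Hypothetical sketch: spread files across 256 hash-bucket subdirectories.
# Two hex characters of the name's md5 give 256 fixed buckets, so even
# 1.5 million files leaves each directory with only a few thousand entries.
base=$(mktemp -d)                                  # stand-in for the data root
f="attachment_12345.txt"                           # hypothetical file name
bucket=$(printf '%s' "$f" | md5sum | cut -c1-2)    # e.g. "3f"
mkdir -p "$base/$bucket"
echo "$base/$bucket/$f"
```

The directory structure stays fixed (all 256 buckets can be created up
front), which also keeps *SM's directory backup traffic stable from run
to run.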

I have no more than anecdotal evidence that any of these points will
help ... but maybe one will  :-)

cheers, wayne

Wayne T. Smith                          ADSM AT Maine DOT edu
ADSM Technical Coordinator - UNET       University of Maine System