I've seen the problem on clients with as few as 10 million files.
When you do a typical incremental backup, the first thing that happens is
that the server pushes down to the client a list of the "active" backup
versions in its inventory. The problem is the client holding that entire
list in memory.
I'm surprised you aren't also having problems with the client completing
backups on time, having to navigate a filesystem that large.
To fix the memory issue, the quick-and-dirty fix is MEMORYEFFICIENTBACKUP.
That lets the client use disk space to store that humongous list instead of
holding it all in RAM. I've never had it cause a problem, nor does it seem
to slow things down that much.
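For reference, something like this in the client's dsm.opt does it. These
are real 5.4+ client options, but the cache path below is just an example;
note that MEMORYEFFICIENTBACKUP YES processes one directory at a time,
while the DISKCACHEMETHOD setting is the one that actually spills the list
to disk:

    * dsm.opt excerpt (sketch -- cache location is an example path)
    MEMORYEFFICIENTBACKUP DISKCACHEMETHOD
    DISKCACHELOCATION d:\tsmdiskcache

Put the disk cache on a volume with enough free space; the cache for tens
of millions of objects can run to several GB.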
If you still have issues, and the daily file change rate is low, then go to
journal-based backup -- the journal service tracks file changes as they
happen, so the client doesn't have to scan the whole filesystem at backup
time.
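If you do go the journal-based backup route on Windows, the journal service
is configured in tsmjbbd.ini. A minimal sketch -- the stanza names are the
real ones, but the drive letters and paths are examples:

    [JournalSettings]
    Errorlog=c:\tsmjournal\jbberror.log
    Journaldir=c:\tsmjournal

    [JournaledFileSystems]
    JournaledFileSystems=d: e:

Then install/start the journal service and the next incremental will use
the journal instead of walking the filesystem.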
On Thu, Oct 9, 2008 at 11:23 AM, Zoltan Forray/AC/VCU <zforray AT vcu DOT EDU> wrote:
> Is there a "realistic" maximum number of files a Windows client can handle
> before having this kind of problem?
> This system has 30-40 MILLION objects/files.
> Not sure if this error is caused by the large number of files or a client
> problem. They are currently running an older client level. Recommended at
> least going to the 5.5.1.x level before we troubleshoot this problem
> further, since they are also getting the dreaded "RC = 13" error which,
> IIRC, the 5.5.1.x client resolves/addresses.
> If this doesn't help, what other recommendations are there to handle this
> situation? MEMORYEFFICIENTBACKUP ? Multiple backup/node definitions to
> run different backups for each drive (1-drive has 26M and another has