ADSM-L

Re: Large Linux clients

Subject: Re: Large Linux clients
From: jsiegle <jsiegle AT PSU DOT EDU>
To: ADSM-L AT VM.MARIST DOT EDU
Date: Tue, 29 Mar 2005 09:09:37 -0500
Zoltan,
        I had a similar problem on a Windows box with 5.4 million files. Tivoli
said that I couldn't do the backup/restore with a 32-bit client because
each file in the catalog takes 1 KB and the 32-bit program can only
address 4 GB of memory. Here is a link they gave me:

http://www-1.ibm.com/support/entdocview.wss?rs=0&context=SSGSG7&q1=1197172&uid=swg21197172&loc=en_US&cs=utf-8&lang=&NotUpdateReferer=
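For a rough sense of the arithmetic behind that technote: at ~1 KB per inspected file, a 32-bit client's 4 GB address space tops out around 4 million files in theory, and considerably fewer once the heap and libraries take their share. A quick back-of-envelope sketch (the usable-fraction figure is my guess, not from the technote):

```python
# Back-of-envelope memory check for a 32-bit TSM client.
# Assumptions (mine, not from the technote): ~1 KB of client memory per
# inspected file, and only part of the 4 GB address space is actually
# usable for the file inventory (heap, shared libraries, etc.).
BYTES_PER_FILE = 1024
ADDRESS_SPACE = 4 * 1024**3  # 4 GB theoretical 32-bit limit

def max_files(usable_fraction=0.5):
    """Rough ceiling on files a 32-bit client can inventory at once."""
    return int(ADDRESS_SPACE * usable_fraction) // BYTES_PER_FILE

print(max_files(1.0))  # 4194304 -- the theoretical ceiling
print(max_files())     # 2097152 -- with only half the space usable
```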

It doesn't quite address your problem, though, if you are only dealing
with 1.4 million files. You may want to look into VIRTUALMOUNTPOINTs.
And ironically enough, MEMORYEF didn't help at all for the backup part.
I'm going to open a problem with Tivoli on this in May when I get the
scenario set up.
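In case it helps, carving the big directory into separate filespaces with VIRTUALMOUNTPOINT would look roughly like this in the dsm.sys server stanza (the server name and paths here are made up for illustration):

```
SErvername  TSMSRV1
   * Each virtual mount point becomes its own filespace, so an
   * incremental builds a smaller in-memory inventory per pass
   VIRTUALMOUNTPOINT  /data/bigdir/sub01
   VIRTUALMOUNTPOINT  /data/bigdir/sub02
   * Trade backup speed for memory while scanning
   MEMORYEFFICIENTBACKUP  YES
```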

Oh, and you probably know to check ulimits.
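A quick way to eyeball those limits for the account that runs the client (which account, and what values are adequate, depend on your box):

```shell
# Show the per-process limits most relevant to a huge incremental:
# virtual memory (in KB) and open file descriptors.
vmem=$(ulimit -v)
nofile=$(ulimit -n)
echo "max virtual memory: ${vmem} KB"
echo "max open files:     ${nofile}"
```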

--
Jonathan

Zoltan Forray/AC/VCU wrote:
Thanks for the suggestion.

We have tried it.  Same results.  Things just go to sleep!

"Mark D. Rodriguez" <mark AT MDRCONSULT DOT COM>
Sent by: "ADSM: Dist Stor Manager" <ADSM-L AT VM.MARIST DOT EDU>
03/28/2005 05:30 PM
Please respond to
"ADSM: Dist Stor Manager" <ADSM-L AT VM.MARIST DOT EDU>


To
ADSM-L AT VM.MARIST DOT EDU
cc

Subject
Re: [ADSM-L] Large Linux clients

Zoltan,

I am not sure if this will fix the problem or not.  I have seen in the
past, when trying to back up directories (including sub-directories)
with a large number of files, that the system runs out of memory and
either fails or hangs forever.  The one thing that I have done that has
worked in some cases is to use the MEMORYEFficientbackup option.  It is
a client-side option and can be placed in the option file or specified
on the command line.  I would try it and see if it helps.  BTW, there is
a downside to this: backups will be slower.  However, slow is still
faster than not at all!
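For reference, the option looks like this (the minimum abbreviation is MEMORYEF; the filespec is illustrative):

```
* One-off, from the command line:
dsmc incremental /data -memoryefficientbackup=yes

* Or persistently, in dsm.sys (UNIX) / dsm.opt (Windows):
MEMORYEFFICIENTBACKUP  YES
```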

Let us know if that helps.

--
Regards,
Mark D. Rodriguez
President MDR Consulting, Inc.

===============================================================================
MDR Consulting
The very best in Technical Training and Consulting.
IBM Advanced Business Partner
SAIR Linux and GNU Authorized Center for Education
IBM Certified Advanced Technical Expert, CATE
AIX Support and Performance Tuning, RS6000 SP, TSM/ADSM and Linux
Red Hat Certified Engineer, RHCE
===============================================================================



Zoltan Forray/AC/VCU wrote:

I am having issues backing up a large Linux server (client=5.2.3.0).

The TSM server is also on a RH Linux box (5.2.2.5).

This system has over 4.6M objects.

A standard incremental WILL NOT complete successfully. It usually
hangs, times out, etc.

The troubles seem to be related to one particular directory with
40 subdirectories, comprising 1.4M objects (per the box owner).

If I point at this directory as a whole (via the web BA client) and try
to back it up in one shot, it displays the "inspecting objects" message
and then never comes back.

If I drill down further and select the subdirs in groups of 10, it
seems to back them up with no problem.

So, one question I have is: is anyone out there backing up large Linux
systems similar to this?

Any suggestions on what the problem could be?

Currently, I do not have access to the error-log files, since this is a
protected/firewalled system and I don't have the ID/password.



