Subject: Re: Space Reclamation Eating Tapes
From: "Bos, Karel" <Karel.Bos AT ATOSORIGIN DOT COM>
To: ADSM-L AT VM.MARIST DOT EDU
Date: Mon, 28 Nov 2005 15:52:55 +0100
Things have changed in 5.3.
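(Presumably this refers to the reclamation-related changes in 5.3, such
as collocation groups and the RECLAIM STGPOOL command for driving
reclamation explicitly rather than only by lowering the pool threshold.
A rough sketch from a dsmadmc session; TAPEPOOL and the threshold value
are placeholders, not anything taken from this thread:

   query stgpool TAPEPOOL format=detailed
   reclaim stgpool TAPEPOOL threshold=60

The first command shows the pool's current reclamation threshold and
collocation setting; the second reclaims volumes with at least 60%
reclaimable space.)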

-----Original Message-----
From: ADSM: Dist Stor Manager [mailto:ADSM-L AT VM.MARIST DOT EDU] On Behalf Of
Dennis Melburn W IT743
Sent: maandag 28 november 2005 15:35
To: ADSM-L AT VM.MARIST DOT EDU
Subject: Re: Space Reclamation Eating Tapes

I don't think that's it; only 2 of my roughly 12 storage pools have
collocation turned on (those pools are for our large file servers), so
collocation shouldn't be affecting most of them.


Mel Dennis
Systems Engineer - IT743
Siemens Power Generation
4400 Alafaya Trail
Orlando, FL 32826
MC Q1-110
Tel:  (407) 736-2360
Win:  439-2360
Fax: (407) 736-5069
Email:  melburn.dennis AT siemens DOT com

-----Original Message-----
From: ADSM: Dist Stor Manager [mailto:ADSM-L AT VM.MARIST DOT EDU] On Behalf Of
Bos, Karel
Sent: Monday, November 28, 2005 9:30 AM
To: ADSM-L AT VM.MARIST DOT EDU
Subject: Re: [ADSM-L] Space Reclamation Eating Tapes

Sounds like a collocation problem. Q stg / q node / q collocgroup.
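(Spelled out, those checks would look roughly like the following from a
dsmadmc session; the pool and node names below are placeholders, not
values from this environment:

   query stgpool TAPEPOOL format=detailed
   query node FILESRV01 format=detailed
   query collocgroup

The first shows the pool's Collocate? setting, the second should show
which collocation group, if any, the node belongs to, and the third
lists the collocation groups defined on the server.)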
 
Regards,
Karel
-----Original Message-----
From: ADSM: Dist Stor Manager [mailto:ADSM-L AT VM.MARIST DOT EDU] On Behalf Of
Dennis Melburn W IT743
Sent: maandag 28 november 2005 15:22
To: ADSM-L AT VM.MARIST DOT EDU
Subject: Space Reclamation Eating Tapes

I recently migrated our Windows 2K3 TSM server from 5.2.1.3 to 5.3.2.0,
and since then, whenever I kick off space reclamation for my primary
tape storage pools, it eats up scratch tapes instead of freeing them
up.  Is there a reason for this?  I understand that occasionally TSM
will need a scratch tape to combine other tapes, but it should then
free those other tapes up and return them to the scratch pool.  I've
checked the reuse delay on the storage pools, and it is set to 0 on all
of them, so I know that isn't the problem.
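(One way to see where the tapes go is to compare volume states before
and after a reclamation run; a rough sketch, again with TAPEPOOL as a
placeholder pool name:

   query stgpool TAPEPOOL format=detailed
   query volume stgpool=TAPEPOOL status=pending
   query volume stgpool=TAPEPOOL status=empty
   query libvolume

With a reuse delay of 0, reclaimed volumes should go straight back to
scratch, so volumes lingering in PENDING status, or empty private
volumes in the library that never return to scratch, would point to
where the scratch tapes are disappearing.)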
 
 
Mel Dennis
