How to determine size and required volumes from a particular backup?

ldmwndletsm

ADSM.ORG Senior Member
Joined
Oct 30, 2019
Messages
232
Reaction score
5
Points
0

Is there a way to get a listing of the number of bytes of data that was backed up for a file space on a specific night? And if so then how do you determine which tape it's on?

I can determine which volumes a file space is on (https://www.ibm.com/support/pages/identifying-which-volume-filespace-stored), but this doesn't tell me 1. which volume has the backup from a specific date and 2. how much data, if any, was backed up for that file space on that date.
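(For reference, I believe that page boils down to a select on the volumeusage table, something like this, with placeholder node and filespace names:)

select volume_name from volumeusage where node_name='NODE_A' and filespace_name='/some/fs'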

I know I did a test restore once where I did NOT perform a point-in-time recovery. I wasn't trying to rebuild the directory as it looked at a particular point in time; I only wanted to restore whatever was backed up under that directory on that particular night. Maybe I'm not remembering correctly, but I think I did this by specifying a date range or something similar. What I don't recall is whether there was a way to determine the required tapes prior to running the restore.

I would like to generate a list of the sizes of the backups for a particular file space so I can select one to return from off-site for a DR test. I don't want a backup instance that's too big or too small. Would be nice to be able to generate such a list along with the number of files backed up, and the associated volume, natch.
 

> Is there a way to get a listing of the number of bytes of data that was backed up for a file space on a specific night?
Yes: https://thobias.org/tsm/sql/#toc171
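The general idea behind that kind of query is to sum the nightly backup traffic from the summary table, roughly like this (NODE_A is a placeholder, and note this is per node, not per filespace):

select date(start_time) as backup_date, sum(bytes) as backup_bytes, sum(affected) as files from summary where activity='BACKUP' and entity='NODE_A' group by date(start_time) order by 1

Keep in mind the summary table only goes back as far as your activity summary retention period.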
> And if so then how do you determine which tape it's on?
In theory, yes, but it would be a long, messy SQL query: basically the query above combined with a join on the contents table. Also, unless you are doing FULL backups, last night's backup is not what you would restore during a DR anyway; it would be a collection of tapes spanning the past several months.

You are better off using:
q nodedata {node_name} stgpool={offsite_pool_name}
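If you need it narrowed down to a single filespace rather than the whole node, the volumeusage table from the link in your first post can also be filtered by the offsite pool, for example (names are placeholders):

select distinct volume_name from volumeusage where node_name='NODE_A' and filespace_name='/some/fs' and stgpool_name='OFFSITEPOOL'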


> I would like to generate a list of the sizes of the backups for a particular file space so I can select one to return from off-site for a DR test. I don't want a backup instance that's too big or too small. Would be nice to be able to generate such a list along with the number of files backed up, and the associated volume, natch.
If you really only want to test with a small subset of data, I recommend doing the following (see the command sketch after the list):
  1. stop ALL client backups and archives
  2. do a backup stgpool of all primary pools and send tapes offsite (that's to get a clean slate)
  3. do a FULL backup of NODE_A (incremental won't work)
  4. do a backup stgpool of the pool(s) where NODE_A backs up
  5. do a database backup
  6. grab the tapes from steps 4 and 5 to do your DR test
The test above is valid as a small DR test, but it's not really representative of real life.
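If it helps, here is a rough sketch of the commands behind steps 1-5 (pool, device class, and node names like BACKUPPOOL, OFFSITEPOOL and LTO_CLASS are placeholders for your environment):

step 1: disable sessions client
steps 2 and 4: backup stgpool BACKUPPOOL OFFSITEPOOL
step 3 (run on NODE_A): dsmc selective "/data/*" -subdir=yes
step 5: backup db devclass=LTO_CLASS type=full

If you use DRM, q drmedia * wherestate=mountable will then list the tapes to send to the vault.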
 

With DB2 it's not too bad. If you want to know how much was backed up for a specific filespace, use one of the queries below.

select a.filespace_name, sum(b.BFSIZE/1024) as BACKUP_KB from backups a, backup_objects b where a.BACKUP_DATE between '2020-06-18 15:00' and '2020-06-19 10:00' and a.node_name ='CASPER' and a.object_id=b.OBJID group by a.filespace_name

or

select a.filespace_name, sum(b.BFSIZE/1024) as BACKUP_KB from backups a, backup_objects b where a.BACKUP_DATE >=(current_timestamp - 24 hours) and a.node_name ='CASPER' and a.object_id=b.OBJID group by a.filespace_name

Of course, you can change the result to MB or GB by adding more division to the BFSIZE calculation.
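For example, a GB version of the first query (same placeholder node and dates):

select a.filespace_name, sum(b.BFSIZE)/1024/1024/1024 as BACKUP_GB from backups a, backup_objects b where a.BACKUP_DATE between '2020-06-18 15:00' and '2020-06-19 10:00' and a.node_name ='CASPER' and a.object_id=b.OBJID group by a.filespace_name

Dividing the sum rather than each BFSIZE also avoids integer division rounding small objects down to zero.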
 