SQL query: data backed up each year/month

Discussion in 'Scripting' started by tsmdumber, Feb 21, 2011.

  1. tsmdumber

    tsmdumber New Member

    Joined:
    Jan 27, 2011
    Messages:
    108
    Likes Received:
    3
    Hi,

    Is it possible to write a query to find out how much data was backed up or archived for a node in a particular month? The months in question go back more than a year or two.

    NODENAME DATABACKEDUP MONTH/YEAR

    Regards,
    TSMDUMBER
     
  3. GregE

    GregE Senior Member

    Joined:
    May 12, 2006
    Messages:
    2,100
    Likes Received:
    31
    It's possible. :)

    I run this as a daily report, but it may get you going in the right direction.

    Code:
    SELECT - 
       n.node_name||'  ' as NODE, -
       n.platform_name as PLATFORM, -
       su.activity,sum(cast(su.bytes/1024/1024/1024 as decimal(6,2))) as GB, -
       su.affected -
    FROM nodes n, summary su -
    WHERE -
       (end_time between current_timestamp - 24 hours and current_timestamp) -
       and su.activity='BACKUP' -
       and ((n.node_name=su.entity)) - 
    GROUP BY -
       n.node_name, -
       n.platform_name, -
       su.activity, -
       su.affected - 
    ORDER BY - 
       platform, -
       GB, -
       node asc
    
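    If you need it broken out per month rather than per day, the same idea can be grouped on the year and month of end_time. This is only a rough sketch: it assumes your server's SQL supports the YEAR() and MONTH() functions and that the summary table still holds the period you care about, and MYNODE is just a placeholder.

    Code:
    SELECT su.entity as NODE, -
       year(su.end_time) as YR, -
       month(su.end_time) as MON, -
       sum(cast(su.bytes/1024/1024/1024 as decimal(12,2))) as GB -
    FROM summary su -
    WHERE su.activity in ('BACKUP','ARCHIVE') -
       and su.entity='MYNODE' -
    GROUP BY su.entity, year(su.end_time), month(su.end_time) -
    ORDER BY YR, MON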
     
  4. tsmdumber

    tsmdumber New Member

    Joined:
    Jan 27, 2011
    Messages:
    108
    Likes Received:
    3
    Thanks Greg. I am looking for something that can go back a year or more. With an actlog retention of 100 days, querying the summary table will not help.

    In which table is the size of a file stored?

    I found something like this on the internet:

    select sum(BYTES)/1024/1024/1024 "Data_Backup GB" from summary where start_time>(current_timestamp - 1 days) and End_Time< (current_timestamp) and ACTIVITY in ('BACKUP','ARCHIVE') and ENTITY='______'

    but I want a period spanning two calendar years, say Nov 2008 to Nov 2009.
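    For a fixed window like that, the relative timestamps in the query above could presumably be swapped for explicit dates. A sketch only, with MYNODE as a placeholder; it still returns only whatever is left within the summary retention:

    Code:
    select sum(BYTES)/1024/1024/1024 as "Data_Backup GB" -
       from summary -
       where date(END_TIME) between '2008-11-01' and '2009-11-30' -
       and ACTIVITY in ('BACKUP','ARCHIVE') -
       and ENTITY='MYNODE'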
     
  5. ad283m

    ad283m New Member

    Joined:
    Sep 22, 2010
    Messages:
    152
    Likes Received:
    1
    What about the ARCHIVES table, or the BACKUPS table?

    Those might do the trick for you.

    But be careful: depending on the database size, those tables can be very big, and querying them can be time consuming.
     
  6. tsmdumber

    tsmdumber New Member

    Joined:
    Jan 27, 2011
    Messages:
    108
    Likes Received:
    3
    The ARCHIVES and BACKUPS tables do not have a BYTES column or anything else related to size.

    tsm: TSM>select * from backups where node_name =
    ANR2963W This SQL query may produce a very large result table, or may require a significant amount of time to compute.

    Do you wish to proceed? (Yes (Y)/No (N)) y

    NODE_NAME:
    FILESPACE_NAME: /adsmorc
    FILESPACE_ID: 1
    STATE: ACTIVE_VERSION
    TYPE: FILE
    HL_NAME: //
    LL_NAME: 00jttadl_1_1
    OBJECT_ID: 110103009
    BACKUP_DATE: 2008-10-23 23:13:58.000000
    DEACTIVATE_DATE:
    OWNER:
    CLASS_NAME: DEFAULT

    tsm: TSM>select * from occupancy where node_name=

    NODE_NAME:
    TYPE: Bkup
    FILESPACE_NAME:
    STGPOOL_NAME:
    NUM_FILES: 54116
    PHYSICAL_MB: 48613062.85
    LOGICAL_MB: 48613062.85
    FILESPACE_ID: 1

    Still, the big question is: in which table is the size of a file stored?

    Regards,
    TSMDUMBER
     
  7. ad283m

    ad283m New Member

    Joined:
    Sep 22, 2010
    Messages:
    152
    Likes Received:
    1
    OK, I get it.

    Well, there is the CONTENTS table; it has FILE_NAME and FILE_SIZE columns, as well as NODE_NAME.

    Again, this table is even bigger than the ones I mentioned before. Once I had to run a query against it and it was so slow that I decided to go with OCCUPANCY instead.
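    If you do try it, the shape of the query would be roughly this (a sketch only: MYNODE is a placeholder, FILE_SIZE reflects what is stored now rather than when it was backed up, and it can run for a very long time on a large server):

    Code:
    SELECT node_name, -
       sum(file_size)/1024/1024/1024 as GB -
    FROM contents -
    WHERE node_name='MYNODE' -
    GROUP BY node_name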

    Regards.
     
