Veritas-bu

Re: [Veritas-bu] Ideal Netbackup solution for backing up millions of files

2014-09-29 05:58:47
Subject: Re: [Veritas-bu] Ideal Netbackup solution for backing up millions of files
From: William Brown <william.d.brown AT gsk DOT com>
To: "VERITAS-BU AT MAILMAN.ENG.AUBURN DOT EDU" <VERITAS-BU AT mailman.eng.auburn DOT edu>
Date: Mon, 29 Sep 2014 09:58:18 +0000

There is no point using SAN Client without Fibre Channel; it can fall back to Ethernet, but then you are back to the function of the standard client.  As Michael said, it will not help, as it is all about fast transmission with lower CPU overhead than Ethernet; it is best used for servers that have spare fibre connections and cannot be fitted with 10GbE.  Your problem is the time taken to scan the file system, both to find the files to include in the incrementals and to reset the access time after the backup has read each file.
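To put a shape on that, below is a very rough Python sketch (purely illustrative, not NetBackup's actual logic) of what an incremental scan has to do: visit and stat every one of the millions of files just to decide which handful to back up. The walk itself is the cost, not the data moved.

    import os

    def files_changed_since(top, last_backup_epoch):
        # Walk the whole tree and stat every file, keeping those modified
        # since the last backup. With millions of files the walk alone can
        # take hours, even if almost nothing has changed.
        changed = []
        for root, _dirs, files in os.walk(top):
            for name in files:
                path = os.path.join(root, name)
                try:
                    if os.path.getmtime(path) > last_backup_epoch:
                        changed.append(path)
                except OSError:
                    pass  # file removed between listing and stat
        return changed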

 

So you need a solution that doesn’t do that scan.  Rusty has given you good pointers.  The choices may be determined by what kind of restores you might want to do.  Your scheme with incrementals would take a very long time for a full restore.

 

Backing up a VM can be very efficient – I’ve no experience of it but I understand there are lots of features for backing up changed blocks only.  However, for best value you want to be sending the data to some kind of deduplicated store.

 

And I should remind you that support from Microsoft for WS2003 ends next year...

 

You may buy a small amount of backup time (at the possible cost of longer single-file restore times) by using a touch file, DONT_SORT_DIR.  I’ve not seen it discussed recently, but it was a common recommendation ten or so years ago for NetBackup in scenarios like yours.  The most recent mention I can find is here: http://www.symantec.com/connect/forums/client-configuration-option.  I think that in your situation you would create it on the WS2003 client in the \Program Files\VERITAS\netbackup folder.

It prevents NetBackup sorting the list of files to back up for the full backup, which can take a very long time with lots of files.  I don’t think it will save much on an incremental, unless there are lots of new files every day/week, as it still has no choice but to look at every file’s modification date.  It would be easy to try, just as easy to remove, and costs no money and little effort.  The impact, I assume, is that on a restore of one or a few files you might have to let it read right through the backup image to find them, whereas if the list is sorted all the files for one folder will be together.  But like any restore, once it has got back what you wanted you can kill it.
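For what it’s worth, creating the touch file is trivial; a minimal Python sketch follows (the path is the default client install folder mentioned above, so adjust it if NetBackup is installed elsewhere):

    import os

    # Default WS2003 client install folder as above; the file only has to
    # exist, its contents are ignored.
    touch_file = r"C:\Program Files\VERITAS\NetBackup\DONT_SORT_DIR"
    open(touch_file, "a").close()

    # And to back the change out again:
    # os.remove(touch_file)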

 

 

 

William Brown

 

From: veritas-bu-bounces AT mailman.eng.auburn DOT edu [mailto:veritas-bu-bounces AT mailman.eng.auburn DOT edu] On Behalf Of Rusty Major
Sent: 28 September 2014 00:51
To: VERITAS-BU AT MAILMAN.ENG.AUBURN DOT EDU
Subject: Re: [Veritas-bu] Ideal Netbackup solution for backing up millions of files

 

Hi,

 

You could try using some sort of zip utility to combine all the files and the directory structure into a single file. Building the archive will still take time and resources on the system, but it would make the backup itself much quicker. There are better ideas, though.
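As a rough illustration only (the paths are placeholders, and you would want to stage the archive on a separate drive), something along these lines would turn the whole tree into a single stream for NetBackup to pick up:

    import os
    import zipfile

    SOURCE_DIR = r"D:\appdata"            # placeholder: folder with the millions of files
    ARCHIVE = r"E:\staging\appdata.zip"   # placeholder: staging area on another drive

    # ZIP_STORED skips compression; the point is to cut the file count the
    # backup has to walk, not to shrink the data.
    with zipfile.ZipFile(ARCHIVE, "w", compression=zipfile.ZIP_STORED,
                         allowZip64=True) as zf:
        for root, _dirs, files in os.walk(SOURCE_DIR):
            for name in files:
                path = os.path.join(root, name)
                zf.write(path, arcname=os.path.relpath(path, SOURCE_DIR))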

 

NetBackup has a relatively new feature called Accelerator that you might be able to use if you have NBU appliances or a vendor who supports it through OST. This is good for low change rate scenarios.

 

VADP would allow you to do a snapshot backup of the entire VM with NetBackup; this would be much faster. There are other requirements, but it definitely would work well.

 

FlashBackup may take just as long, as it has to back up the entire raw device, including any blank space on the filesystem, but it would also work.

 

Another consideration would be synthetic backups. The synthetic full is built from earlier backups rather than read from the client, but you would still lose a lot of time on each incremental as the filesystem is scanned to find changed files. This is also good for low change rate scenarios.

 

I believe all of these will require a license to enable and they may require testing to see which one works best in your situation.

 

I hope that helps,

Rusty

 

On Sat, Sep 27, 2014 at 2:43 PM, Shaheensn <nbu-forum AT backupcentral DOT com> wrote:

Hi,

It would be extremely helpful if the expert minds in this forum could help me with the concerns below related to backing up data using Symantec NetBackup.

Background

Our system has a folder containing millions of small files, spread across thousands of sub-folders and running to more than 500GB, that needs to be backed up on a daily basis. The backup usually starts at the end of the day and is ideally supposed to finish before users start accessing the system the following day. However, the current backup using Symantec NetBackup takes a lot of time, and hence it impacts system performance. I assume this is because the current approach used by Symantec NetBackup is a file-level backup.

The current backup policy has a monthly full backup, a weekly differential backup and a daily incremental backup. However, the daily backup itself takes more than double the desired time. The Symantec client version is 7.6.02. The server is a physical server running Windows Server 2003, and the drive is dedicated to storing the files.

I am looking for a solution that would provide the fastest backup in order to ensure that backups are completed before users start accessing the system in the morning.

1. What is the best approach for backing up millions of small files spread across 1000s of folders using Symantec Netbackup?
2. I understand that the FlashBackup feature in Symantec NetBackup uses block-level backup; would this be the ideal approach since there are a lot of files involved?
3. What are the disadvantages of using Symantec Flashbackup feature?
4. Does installing Symantec SAN client without having fiber channel offer any improvement in backup speed? I read on some forums that you should use SAN client only if there is fiber channel support.
5. Would moving the applications to virtual machines hosted in a VMware environment provide any improvement?

PS: Backups are not my domain of expertise, so I would like to apologize in advance if I have mentioned something absurd. Secondly, changing the application file structure is not an option :(

Cheers,

Shaheen






 

--

Rusty Major ▪ Manager - Compute Engineering, Backup and Storage ▪ Sungard Availability Services
757 N. Eldridge Pkwy, Suite 200, Houston, TX 77079 ▪ Office: 281-584-4693 ▪ Mobile: 713-724-4914 ▪ rusty.major AT sungardas DOT com ▪ www.sungardas.com

     





_______________________________________________
Veritas-bu maillist  -  Veritas-bu AT mailman.eng.auburn DOT edu
http://mailman.eng.auburn.edu/mailman/listinfo/veritas-bu