Re: [BackupPC-users] 8.030.000, Too much files to backup ?
2011-12-19 23:33:44
Sorry to take so long to reply.
Yes, it saves me a lot of time; let me explain.
Although I have a fast SAN and servers, the time to fetch lots of small files is high: the maximum bandwidth I could get was about 5 MB/s. By increasing concurrency I can get about 20-40 MB/s, depending on what I'm backing up at the moment. This way I get more out of the SAN and the backup server. If I increased concurrency even further I could reach higher throughput, but I don't want BackupPC to take all the available I/O; to be honest I don't need to anyway, as I get really good performance this way.
This setup is running at a large financial group, and it outperforms very expensive (and complex) proprietary solutions.
The BackupPC server should have a fair amount of RAM and CPU, and shouldn't be virtualized. In my case it's a 4-core server with 8 GB of RAM (although it swaps a bit). I'm also using ssh + rsync, which adds some overhead, but nothing critical.
Cheers,
Pedro
Sent from my galaxy nexus.
www.linux-geex.com
On Dec 19, 2011 6:05 PM, "Jean Spirat" <jean.spirat AT squirk DOT org> wrote:
On 18/12/2011 20:44, Pedro M. S. Oliveira wrote:
You may try to use rsyncd directly on the server; this may speed things up.
Another thing is to split the large backup into several smaller ones. I have an email cluster with 8 TB and millions of small files (I'm using dovecot); there's also a SAN involved. In order to use all the available bandwidth I configured the backups to run on usernames starting with a to e, f to j, and so on; they all run at the same time. Incrementals take about 1 hour and fulls about 5.
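(One way to implement this split in BackupPC is to define one host entry per username range, all aliased to the same mail server; a sketch assuming hypothetical host names and a hypothetical /var/mail mailbox layout:)

```perl
# Per-host config mailserver-ae.pl -- one of several entries that
# all point at the same physical server via ClientNameAlias.
# Host names, paths, and patterns here are illustrative.
$Conf{ClientNameAlias} = 'mailserver';
$Conf{XferMethod}      = 'rsync';
$Conf{RsyncShareName}  = ['/var/mail'];
$Conf{BackupFilesOnly} = { '/var/mail' => ['/[a-e]*'] };
# mailserver-fj.pl would use ['/[f-j]*'], and so on; BackupPC then
# schedules the ranges as independent backups that can run concurrently.
```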
Cheers,
Pedro
I directly mount the NFS share on the BackupPC server, so there is no need for rsyncd here; it is like a local backup, with the NFS overhead of course.
Do you gain a lot from splitting instead of doing just one big backup? You seem to have about the same number of files as I do.
regards,
Jean.
_______________________________________________
BackupPC-users mailing list
BackupPC-users AT lists.sourceforge DOT net
List: https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki: http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/