Re: [Bacula-users] 11TB backup run out of memory and dies
2010-06-09 11:49:06
On Wed, 09 Jun 2010 11:29:37 -0400, ekke85 wrote:
> Hi
>
> I have a Scalar i500 library with 5 drives and 135 slots, attached to a
> Red Hat 5 server with a 1 Gb NIC. The setup works fine for backups of most
> systems. The problem I have is an NFS share with 11TB of data that I need
> to back up. Every time I run this job it writes about 600GB of data
> to a tape, then the server runs out of memory and remounts the root
> filesystem as read-only. Has anyone else seen this before? What can I
> try to fix it?
Hmm... why does it remount / as read-only on "out of memory"?
Which Bacula version are you using?
What does your Bacula configuration look like?
Does it stop while transferring data, or while inserting metadata into the db?
Which process is eating your memory away?
A workaround could be to split the large FileSet into smaller ones.
I'm running incremental backups of 300-400GB without memory problems on
Bacula 5.0.2.
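To answer that last question, a quick way to spot the culprit is to sort processes by resident memory while the job is running; you would expect bacula-fd or bacula-sd near the top if one of them is leaking. This is just a generic diagnostic sketch, not something from the thread:

```shell
# Show the five biggest memory consumers by resident set size (RSS).
# Run this periodically while the 11TB job is writing to tape.
ps aux --sort=-rss | head -n 5
```

If the growing process is the director rather than the file daemon, the problem is more likely on the catalog/metadata side than in the data transfer.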
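As an illustration of that split, you could define one FileSet per top-level directory of the share and run a separate job for each, so no single run has to track 11TB of files. The names and paths below are hypothetical examples, not taken from the original setup:

```
# Hypothetical split of one huge NFS FileSet into per-directory FileSets.
FileSet {
  Name = "BigNFS-part1"
  Include {
    Options { signature = MD5 }
    File = /mnt/nfs/projects
  }
}
FileSet {
  Name = "BigNFS-part2"
  Include {
    Options { signature = MD5 }
    File = /mnt/nfs/archive
  }
}
# Then define one Job per FileSet; each job holds far fewer
# file entries in memory and commits a smaller batch to the catalog.
```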
- Thomas
_______________________________________________
Bacula-users mailing list
Bacula-users AT lists.sourceforge DOT net
https://lists.sourceforge.net/lists/listinfo/bacula-users