Re: [Bacula-users] incremental backups too large
2011-01-14 15:21:34
>>>>> On Fri, 14 Jan 2011 09:23:37 +0000, Bart Swedrowski said:
>
> 2011/1/13 Mark <bacula-list AT nerdish DOT us>:
> > Have you done a 'list files jobid=<some job>' for one of your incrementals?
> > Maybe you have a few really large files that are getting changed every day,
> > and therefore getting backed up each day.
>
> Yeah, I tried that, too. It's only listing files that were changed or
> are new and should be backed up. The total is still very high, though.
>
> +-------+--------+---------------------+------+-------+----------+----------------+-----------+
> | jobid | name   | starttime           | type | level | jobfiles | jobbytes       | jobstatus |
> +-------+--------+---------------------+------+-------+----------+----------------+-----------+
> | 1,105 | tic FS | 2011-01-10 02:05:08 | B    | I     | 4,585    | 39,502,153,253 | T         |
> +-------+--------+---------------------+------+-------+----------+----------------+-----------+
>
> Again, even though jobbytes shows over 39GB, on disk it's taking at
> most 6GB. I can see how much space the individual volumes are taking.
It sounds like you have some large files that compress well.
I would take the output of 'list files jobid=<some job>' and write a script
to sum the size of every file in the list to verify why the total reaches 39GB.
__Martin
_______________________________________________
Bacula-users mailing list
Bacula-users AT lists.sourceforge DOT net
https://lists.sourceforge.net/lists/listinfo/bacula-users