Bacula-users

Re: [Bacula-users] estimating time remaining on a backup

From: Gavin McCullagh <gavin.mccullagh AT gcd DOT ie>
To: Alex Chekholko <chekh AT pcbi.upenn DOT edu>
Date: Tue, 8 Dec 2009 13:58:34 +0000
Hi,

On Mon, 07 Dec 2009, Alex Chekholko wrote:

> On Thu, 3 Dec 2009 13:46:02 +0000
> Gavin McCullagh <gavin.mccullagh AT gcd DOT ie> wrote:
> 
> > I started a full backup last night of a tired old Windows-based Dell NAS.
> > It's very slow.  The filesystem is full and super-fragmented.  I also have
> > compression turned on which makes the cpu work rather hard and slows things
> > down even further.  1.5MB/sec :-(
> 
> Doesn't that tell you all you need to know?  Knowing this rate: 1.5MB/s
> and the total amount of data in the full backup will tell you the total
> time, no?

Yes, but the trouble is that once it's running, if I haven't previously run
an estimate and noted the result down, I don't know the total amount of
data.  I could run an estimate while the backup is running, but that would
itself be slow and would delay the backup even further.  I could go to the
machine itself and ask it about the contents of all of those directories,
but that would have a similar slowing effect.

To complicate matters, if you have compression on, the client tells you the
compressed amount of data sent (I think?), so you'd need to know the
compression ratio to work out what portion was done.  If you're not running
a full backup, things are even more complex.
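To make the arithmetic concrete, here's a rough sketch of the sum I end up
doing by hand (eta_seconds is a made-up helper, the figures are invented,
and the assumption that the client reports *compressed* bytes sent is, as
above, a guess on my part):

```python
def eta_seconds(bytes_sent, total_bytes_estimate, rate_bytes_per_sec,
                compression_ratio=1.0):
    """Rough seconds remaining for a running backup.

    bytes_sent           -- compressed bytes the client reports as sent
    total_bytes_estimate -- uncompressed size from a prior 'estimate' run
    rate_bytes_per_sec   -- observed transfer rate (compressed, on the wire)
    compression_ratio    -- uncompressed/compressed, e.g. 2.0 for 2:1
    """
    # Convert the compressed bytes sent back into uncompressed progress.
    done_uncompressed = bytes_sent * compression_ratio
    remaining_uncompressed = max(total_bytes_estimate - done_uncompressed, 0)
    # The observed rate is for compressed data, so convert back again.
    remaining_compressed = remaining_uncompressed / compression_ratio
    return remaining_compressed / rate_bytes_per_sec

# Invented example: 300GB sent (compressed) of an estimated 1TB total,
# at 1.5MB/s with 2:1 compression.
GB = 1024**3
secs = eta_seconds(300 * GB, 1024 * GB, 1.5 * 1024**2, compression_ratio=2.0)
print(f"{secs / 3600:.1f} hours remaining")  # prints: 40.2 hours remaining
```

Which is exactly the sort of sum Bacula could presumably do for you, if it
kept (or took) an estimate of the total.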

If I schedule a backup for 3am and come in in the morning to find it's
still going, it's pretty tough to work out how much longer it's going to
be.

I was wondering if Bacula might be in a position to estimate or help the
user estimate the time remaining based on what it already knows?  I'm not
sure if it does any sort of internal estimate at the start or if it just
starts traversing the filesystem sending what it needs to send.  

Even to be told 
        600GB of 1TB (estimated) done.
or
        1400 of 2000 files done.

would be a help.  Obviously you can't assume every file is the same size
but it gives you some idea.  

Perhaps this is just not possible or would come with a cost (in which case
it could perhaps be optional?).

Gavin


_______________________________________________
Bacula-users mailing list
Bacula-users AT lists.sourceforge DOT net
https://lists.sourceforge.net/lists/listinfo/bacula-users
