Bacula-users

Re: [Bacula-users] Finding largest file(s) backed up for a specific job

Subject: Re: [Bacula-users] Finding largest file(s) backed up for a specific job
From: "John Stoffel" <john AT stoffel DOT org>
To: "John Stoffel" <john AT stoffel DOT org>
Date: Thu, 31 Mar 2011 22:30:14 -0400
John> I've been happily running bacula at home, and usually it's pretty
John> predictable about the size of data backed up each night, but
John> last night I had an incremental run for a specific client which
John> used 8GB of data, when I normally expect around 500MB or so.

John> Is there an easy mysql query I can use to:

John> a) find the largest file(s) backed up for a particular jobid?

Ok, I've googled and read back through the archives about the
File.LStat field and how the file size is stored in a strange,
almost-Base64 encoding that needs to be decoded to get the sizes out.
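For the record, the decoding itself looks fairly simple: each
space-separated field of LStat seems to be a plain integer written in
base 64 with the standard Base64 alphabet (no padding), and from what
I can tell from Bacula's encode_stat() the file size should be the 8th
field (after dev, ino, mode, nlink, uid, gid, rdev).  A quick,
untested sketch of just that part, with the field index still to be
verified:

#!/usr/bin/perl
use strict;
use warnings;

# Map Bacula's base64 alphabet (A-Z, a-z, 0-9, +, /) to 0..63.
my %b64;
@b64{ 'A'..'Z', 'a'..'z', '0'..'9', '+', '/' } = 0 .. 63;

# Decode one LStat field: a base-64 integer, most significant
# digit first, no padding.
sub from_base64 {
    my $v = 0;
    $v = $v * 64 + $b64{$_} for split //, shift;
    return $v;
}

# st_size appears to be the 8th space-separated field of File.LStat
# -- worth double-checking against a known file.
sub lstat_size {
    return from_base64( (split / /, shift)[7] );
}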

Looks like I'll just need to:

1. Write a perl script that reads in a jobid, then pulls out the files
   from that job.
2. For each file, pull out the File.LStat field and decode the info.
3. Finally, search for the largest file(s), roughly as in the sketch
   below.
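Roughly, I'm imagining something like this (untested sketch; the DSN,
credentials, and top-20 cutoff are placeholders, the query assumes the
usual File/Path/Filename catalog tables, and the "size is LStat field
8" assumption from above still needs verifying):

#!/usr/bin/perl
# Sketch: print the largest files backed up by a given jobid.
use strict;
use warnings;
use DBI;

my $jobid = shift or die "usage: $0 jobid\n";

# Same base64 decoder as above; size assumed to be LStat field 8.
my %b64;
@b64{ 'A'..'Z', 'a'..'z', '0'..'9', '+', '/' } = 0 .. 63;
sub lstat_size {
    my $v = 0;
    $v = $v * 64 + $b64{$_} for split //, (split / /, shift)[7];
    return $v;
}

# Placeholder DSN and credentials -- adjust for your catalog.
my $dbh = DBI->connect('DBI:mysql:database=bacula', 'bacula', 'secret',
                       { RaiseError => 1 });

my $sth = $dbh->prepare(q{
    SELECT Path.Path, Filename.Name, File.LStat
      FROM File
      JOIN Path     ON Path.PathId         = File.PathId
      JOIN Filename ON Filename.FilenameId = File.FilenameId
     WHERE File.JobId = ?
});
$sth->execute($jobid);

my @files;
while (my ($path, $name, $lstat) = $sth->fetchrow_array) {
    push @files, [ $path . $name, lstat_size($lstat) ];
}

# Show the 20 largest files for this job.
for my $f ( (sort { $b->[1] <=> $a->[1] } @files)[0 .. 19] ) {
    last unless defined $f;
    printf "%15d  %s\n", $f->[1], $f->[0];
}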

I'll post something when I'm done.  I'd prefer to do it in MySQL
directly, but it looks too funky to parse the string properly without
writing really crazy (to me!) SQL procedures or cursors.  Not worth
it.

Thanks,
John

_______________________________________________
Bacula-users mailing list
Bacula-users AT lists.sourceforge DOT net
https://lists.sourceforge.net/lists/listinfo/bacula-users
