Bacula-users

Re: [Bacula-users] Find size of file in Job

From: mark.bergman AT uphs.upenn DOT edu
To: bacula-users AT lists.sourceforge DOT net
Date: Mon, 23 Jun 2008 18:34:45 -0400

In the message dated: Mon, 23 Jun 2008 09:36:51 CDT,
The pithy ruminations from C M Reinehr on 
<Re: [Bacula-users] Find size of file in Job> were:
=> T24gTW9uIDIzIEp1bmUgMjAwOCAwMjo1OSwgQW5uZXR0ZSBKw6RrZWwgd3JvdGU6Cj4gQW0gMjEu
=> ZS5uZXQvbGlzdHMvbGlzdGluZm8vYmFjdWxhLXVzZXJzCg==

        [SNIP!]

Yes, that's how your MIME-encoded message looks.

=> 

---------------------------------------------------------------

Your message, as decoded, reads:


=> On Mon 23 June 2008 02:59, Annette Jäkel wrote:
=> > On 21.06.2008 at 19:17, "C M Reinehr" <cmr AT amsent DOT com> wrote:
=> > > James,
=> > >
=> > > On Fri 20 June 2008 21:49, James Austin wrote:

        [SNIP!]

=> > >>
=> > >> 1. Using bconsole, how do you get a list of files and their sizes stored
=> > >> in an incremental backup, I can find the list of files using the query
=> > >> command but I cannot find any way for displaying the size of the files.

        [SNIP!]

=> 
=> You're correct--my bad. I chose the catalog backup as a sample job which, as 
=> it happens, only contains one file! After taking a quick look at the make 
=> tables script for my version, it doesn't appear that the size of the 
=> individual files backed up is stored in the catalog.

Actually, I believe that the size of each file is stored in the catalog. The 
problem is that the information is stored as binary lstat data (base64 encoded).

In fact, Arno Lehmann documented this on 23 Oct 2007 (in a post in the thread 
entitled "File size from backup needed. Where?").

On 28 Nov 2007, "Jay" posted a python script to the list that will decode some 
of the values in the File.LStat field, so that you can get the information you 
wanted. The post was titled "Filesize in database".
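Jay's script isn't reproduced here, but the decoding itself is short. A minimal
sketch along those lines -- the digit alphabet and the lstat field order below
are my reading of Bacula's source, so treat both as assumptions to verify
against your version:

```python
# Bacula stores each lstat field as an unpadded, variable-length base-64
# integer, space-separated, in the File.LStat catalog column. The digit
# alphabet below is assumed to match Bacula's base64_digits[] table.
BASE64_DIGITS = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/"

def from_base64(s):
    """Decode one of Bacula's base-64 integers (most significant digit first)."""
    val = 0
    for ch in s:
        val = (val << 6) | BASE64_DIGITS.index(ch)
    return val

def lstat_size(lstat):
    """Return st_size from a File.LStat string.

    Assumed field order: dev ino mode nlink uid gid rdev size ...
    so the file size is the eighth field (index 7).
    """
    return from_base64(lstat.split()[7])

# Example: a size field of "Po" decodes to 15*64 + 40 = 1000 bytes.
```

You could feed this the LStat column pulled from the catalog, e.g. via
bconsole's sqlquery command, to list files with their sizes.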

The list archives are a really good place to search before posting...

         http://sourceforge.net/mailarchive/forum.php?forum_name=bacula-users


=> 
=> > >> 2. Are there any recommendations for handling the /var/log folder,
=> > >> something along the lines of truncating the files so that only the last
=> > >> 500k, or only the last 100000 lines are copied?

I include the bacula logs in the logrotate configuration, so that the nightly 
logrotate process handles bacula logs along with all the other system log files.
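For the record, a minimal logrotate stanza along those lines -- the log path
and the rotation settings here are illustrative assumptions, not taken from my
actual configuration:

```
# Hypothetical path -- adjust to wherever your Director writes its log.
/var/log/bacula/bacula.log {
    daily
    rotate 7
    compress
    missingok
    notifempty
}
```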


        [SNIP!]

=> > >> James.
=> > >
=> > > cmr

        [SNIP!]

=> > Annette
=> 
=> cmr
=> 
=> -- 
=> Debian 'Etch' - Registered Linux User #241964
=> --------
=> "More laws, less justice." -- Marcus Tullius Cicero, 42 BC

----
Mark Bergman                              voice: 215-662-7310
mark.bergman AT uphs.upenn DOT edu                 fax: 215-614-0266
System Administrator     Section of Biomedical Image Analysis
Department of Radiology            University of Pennsylvania
      PGP Key: https://www.rad.upenn.edu/sbia/bergman 




The information contained in this e-mail message is intended only for the 
personal and confidential use of the recipient(s) named above. If the reader of 
this message is not the intended recipient or an agent responsible for 
delivering it to the intended recipient, you are hereby notified that you have 
received this document in error and that any review, dissemination, 
distribution, or copying of this message is strictly prohibited. If you have 
received this communication in error, please notify us immediately by e-mail, 
and delete the original message.

_______________________________________________
Bacula-users mailing list
Bacula-users AT lists.sourceforge DOT net
https://lists.sourceforge.net/lists/listinfo/bacula-users
