If life gives you lemons, keep them-- because hey.. free lemons.
--- On Tue, 5/25/10, John Drescher <drescherjm AT gmail DOT com> wrote:
> From: John Drescher <drescherjm AT gmail DOT com>
> Subject: Re: [Bacula-users] bacula using volume for each job every night
> To: "Joseph Spenner" <joseph85750 AT yahoo DOT com>
> Cc: bacula-users AT lists.sourceforge DOT net
> Date: Tuesday, May 25, 2010, 9:22 AM
> On Tue, May 25, 2010 at 11:06 AM, Joseph Spenner <joseph85750 AT yahoo DOT com> wrote:
> >
> > --- On Tue, 5/25/10, Phil Stracchino <alaric AT metrocast DOT net> wrote:
> >
> >> From: Phil Stracchino <alaric AT metrocast DOT net>
> >> Subject: Re: [Bacula-users] bacula using volume for each job every night
> >> To: bacula-users AT lists.sourceforge DOT net
> >> Date: Tuesday, May 25, 2010, 8:49 AM
> >> On 05/25/10 10:37, Joseph Spenner wrote:
> >> > The last few nights, Bacula has been using multiple volumes
> >> > again. I had this problem once before and thought I had fixed it,
> >> > but apparently not. Here are my settings:
> >> >
> >> > # Default pool definition
> >> > Pool {
> >> >   Name = Default
> >> >   Pool Type = Backup
> >> >   Recycle = yes
> >> >   AutoPrune = yes
> >> >   Job Retention = 20 days
> >> >   Volume Retention = 20 days
> >> > }
> >> >
> >> > # File Pool definition
> >> > Pool {
> >> >   Name = File
> >> >   Pool Type = Backup
> >> >   Recycle = yes
> >> >   AutoPrune = yes
> >> >   Job Retention = 20 days
> >> >   Volume Retention = 20 days
> >> >   Volume Use Duration = 23h
> >> >   Maximum Volume Bytes = 1330G
> >> >   Maximum Volumes = 21
> >> > }
> >> >
> >> > I have several client definitions such as:
> >> >
> >> > JobDefs {
> >> >   Name = "TychoJob"
> >> >   Type = Backup
> >> >   Level = Incremental
> >> >   FileSet = "Full Set"
> >> >   Schedule = "WeeklyCycleTuesday"
> >> >   Storage = File
> >> >   Messages = Standard
> >> >   Pool = File
> >> >   Priority = 10
> >> >   Write Bootstrap = "/opt/bacula/bin/working/%c.bsr"
> >> > }
> >> >
> >> >
> >> > Client {
> >> >   Name = tycho-fd
> >> >   Address = tycho
> >> >   FDPort = 9102
> >> >   Catalog = MyCatalog
> >> >   Password = "censoredPassword"
> >> >   AutoPrune = yes
> >> > }
> >> >
> >> > Schedule {
> >> >   Name = "WeeklyCycleTuesday"
> >> >   Run = Level=Full tue at 1:05
> >> >   Run = Incremental wed-mon at 1:05
> >> > }
> >> >
> >> > Anyone have any ideas?
> >>
> >> This may be a silly question, but ... how much data is being
> >> backed up each night?
> >>
> >> How many clients do you have, and what are your various
> >> concurrency settings? In particular, does every storage device
> >> have a concurrency setting greater than the number of clients?
> >>
> >>
> >
> > Currently, not much data is being backed up at all. Last night,
> > 6 volumes/files were used:
> >
> > file_0009 = 292B       Incremental of the Bacula server itself
> > file_0010 = 284.34G    Full backup of a Linux client
> > file_0011 = 5.47MB     Incremental of a Linux client
> > file_0012 = 67.28MB    Incremental of a Linux client
> > file_0013 = 169.85MB   Incremental of a Windows client
> > file_0014 = 44.96MB    Catalog
> >
> > I currently have 5 clients.
> >
>
> How about the output of
>
> list media pool=File
>
> John
>
John:
Here's that output:
*list media pool=File
+---------+------------+-----------+---------+-----------------+----------+--------------+---------+------+-----------+-----------+---------------------+
| MediaId | VolumeName | VolStatus | Enabled | VolBytes        | VolFiles | VolRetention | Recycle | Slot | InChanger | MediaType | LastWritten         |
+---------+------------+-----------+---------+-----------------+----------+--------------+---------+------+-----------+-----------+---------------------+
|       1 | file_0001  | Recycle   |       1 |          11,680 |        0 |    1,728,000 |       1 |    0 |         0 | File      | 2010-05-20 14:45:06 |
|       2 | file_0002  | Recycle   |       1 |   9,253,254,931 |        2 |    1,728,000 |       1 |    0 |         0 | File      | 2010-05-20 14:50:48 |
|       3 | file_0003  | Used      |       1 |     527,479,983 |        0 |    1,728,000 |       1 |    0 |         0 | File      | 2010-05-24 01:05:14 |
|       4 | file_0004  | Used      |       1 |     270,915,092 |        0 |    1,728,000 |       1 |    0 |         0 | File      | 2010-05-24 01:06:02 |
|       5 | file_0005  | Used      |       1 |       4,433,256 |        0 |    1,728,000 |       1 |    0 |         0 | File      | 2010-05-24 01:06:06 |
|       6 | file_0006  | Used      |       1 |             802 |        0 |    1,728,000 |       1 |    0 |         0 | File      | 2010-05-24 01:06:08 |
|       7 | file_0007  | Used      |       1 |      82,725,180 |        0 |    1,728,000 |       1 |    0 |         0 | File      | 2010-05-24 01:06:58 |
|       8 | file_0008  | Used      |       1 |      36,517,484 |        0 |    1,728,000 |       1 |    0 |         0 | File      | 2010-05-24 01:10:04 |
|       9 | file_0009  | Used      |       1 |           1,122 |        0 |    1,728,000 |       1 |    0 |         0 | File      | 2010-05-25 01:05:09 |
|      10 | file_0010  | Used      |       1 | 297,339,853,723 |       69 |    1,728,000 |       1 |    0 |         0 | File      | 2010-05-25 08:38:27 |
|      11 | file_0011  | Used      |       1 |       5,636,837 |        0 |    1,728,000 |       1 |    0 |         0 | File      | 2010-05-25 08:38:37 |
|      12 | file_0012  | Used      |       1 |      68,951,208 |        0 |    1,728,000 |       1 |    0 |         0 | File      | 2010-05-25 08:38:40 |
|      13 | file_0013  | Used      |       1 |     174,055,489 |        0 |    1,728,000 |       1 |    0 |         0 | File      | 2010-05-25 08:39:38 |
|      14 | file_0014  | Used      |       1 |      46,074,125 |        0 |    1,728,000 |       1 |    0 |         0 | File      | 2010-05-25 08:39:42 |
|      15 | file_0015  | Recycle   |       1 |   5,408,772,092 |        1 |    1,728,000 |       1 |    0 |         0 | File      | 2010-05-17 14:02:57 |
|      16 | file_0016  | Recycle   |       1 |      85,692,487 |        0 |    1,728,000 |       1 |    0 |         0 | File      | 2010-05-17 22:02:59 |
|      17 | file_0017  | Recycle   |       1 |   5,479,213,260 |        1 |    1,728,000 |       1 |    0 |         0 | File      | 2010-05-18 14:58:46 |
|      18 | file_0018  | Recycle   |       1 |   7,391,498,443 |        1 |    1,728,000 |       1 |    0 |         0 | File      | 2010-05-19 01:15:46 |
|      19 | file_0019  | Used      |       1 | 301,801,642,606 |       70 |    1,728,000 |       1 |    0 |         0 | File      | 2010-05-21 08:11:25 |
|      20 | file_0020  | Used      |       1 |   1,061,346,999 |        0 |    1,728,000 |       1 |    0 |         0 | File      | 2010-05-22 01:10:07 |
|      21 | file_0021  | Used      |       1 |     478,660,455 |        0 |    1,728,000 |       1 |    0 |         0 | File      | 2010-05-23 01:10:04 |
+---------+------------+-----------+---------+-----------------+----------+--------------+---------+------+-----------+-----------+---------------------+
*
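
A note on the configs quoted above (this is a reading of the standard
Bacula Pool directives, not a confirmed diagnosis of this setup):
Volume Use Duration = 23h means that once 23 hours have passed since a
volume's first write, it is marked Used the next time a job wants it,
so a volume first written tonight at 1:05 can never be appended to at
tomorrow's 1:05 run. That alone guarantees at least one fresh volume
per day; getting one volume per *job* on top of that is the pattern you
would expect if the jobs queued at 1:05 are each handed their own
volume, which is what Phil's concurrency question is probing. A minimal
sketch of the alternative, assuming you would rather cap volumes by
size than by wall-clock time (values are illustrative, not
recommendations):

# Hypothetical variant of the File pool: drop Volume Use Duration so
# each night's jobs can keep appending to the current volume, and let
# Maximum Volume Bytes decide when a new one is started.
Pool {
  Name = File
  Pool Type = Backup
  Recycle = yes
  AutoPrune = yes
  Job Retention = 20 days
  Volume Retention = 20 days
  Maximum Volume Bytes = 1330G   # size cap, carried over from the original pool
  Maximum Volumes = 21
}

The trade-off is that recycling then turns purely on Volume Retention
and the size cap, so it is worth confirming that 21 volumes of up to
1330G each actually fit on the disk.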
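
Since Phil's concurrency question has not yet been answered in the
thread, here is where those knobs live. This is a sketch only: the
resource names other than Storage = File are hypothetical, and the
values are illustrative rather than recommendations.

# bacula-dir.conf -- Director-side concurrency
Director {
  Name = bacula-dir              # hypothetical name
  Maximum Concurrent Jobs = 20   # total jobs the Director runs at once
  # (QueryFile, password, working directory, etc. omitted)
}
Storage {
  Name = File                    # matches "Storage = File" in the JobDefs
  Maximum Concurrent Jobs = 20   # jobs allowed on this storage at once
  # (Address, Password, Device, Media Type omitted)
}

# bacula-sd.conf -- Storage-daemon-side concurrency
Storage {
  Name = bacula-sd               # hypothetical name
  Maximum Concurrent Jobs = 20
  # (SDPort, WorkingDirectory, Pid Directory omitted)
}

Per Phil's question, it is worth checking that the storage-side figure
is at least the number of clients that fire at 1:05; whether concurrent
jobs interleave onto one shared volume or are each given their own
depends on these settings and on the device layout.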