Re: [Bacula-users] bacula saying volume limit hit on tape drive

Subject: Re: [Bacula-users] bacula saying volume limit hit on tape drive
From: Jonathan Horne <jhorne AT skopos DOT us>
To: Jonathan Horne <jhorne AT skopos DOT us>, "bacula-users AT lists.sourceforge DOT net" <bacula-users AT lists.sourceforge DOT net>
Date: Tue, 9 Apr 2013 17:30:11 +0000

Yay, I figured it out.  I added this to the pool that the autoloader lives in:

Pool {
  Name             = TapeArchiveFullBackups
  Pool Type        = Backup
  Recycle          = yes
  AutoPrune        = yes
  Volume Retention = 3 years
  Storage          = PV-TL4000
  Maximum Volume Bytes = 0
}

I did not previously have Maximum Volume Bytes specified at all, so the limit must have been inherited from somewhere. I have now restarted my copy job, and it's at 188GB and still rolling along.

Thanks,
jonathan

From: Jonathan Horne [mailto:jhorne AT skopos DOT us]
Sent: Tuesday, April 09, 2013 11:05 AM
To: bacula-users AT lists.sourceforge DOT net
Subject: [Bacula-users] bacula saying volume limit hit on tape drive

Hello, Bacula is writing just 53GB to my 400GB LTO-3 tapes and then declaring that a user-defined size limit has been exceeded.

09-Apr 10:51 bacula-sd JobId 11602: User defined maximum volume capacity 53,687,091,200 exceeded on device "TL4000D2" (/dev/nst1).
09-Apr 10:51 bacula-sd JobId 11602: Re-read of last block succeeded.
09-Apr 10:51 bacula-sd JobId 11602: End of medium on Volume "SFG019L3" Bytes=53,687,079,936 Blocks=832,202 at 09-Apr-2013 10:51.
09-Apr 10:51 bacula-sd JobId 11602: 3307 Issuing autochanger "unload slot 8, drive 1" command.
09-Apr 10:52 bacula-sd JobId 11602: 3301 Issuing autochanger "loaded? drive 1" command.
09-Apr 10:52 bacula-sd JobId 11602: 3302 Autochanger "loaded? drive 1", result: nothing loaded.

As far as I can tell, I do not have a size limit defined for this device.

Device {
  Name = TL4000D2
  Drive Index = 1
  Media Type = LTO-3
  Archive Device = /dev/nst1
  Changer Command = "/usr/libexec/bacula/mtx-changer %c %o %S %a %d"
  Changer Device = /dev/sg6
  Autochanger = yes
  AutomaticMount = yes;
  AlwaysOpen = yes;
  RemovableMedia = yes
  RandomAccess = no
  RequiresMount = yes
  LabelMedia = no
# Enable the Alert command only if you have the mtx package loaded
# If you have smartctl, enable this, it has more info than tapeinfo
  Alert Command = "sh -c 'smartctl -H -l error %c'"
}

However, I do have a limit defined on the file storage volumes that this copy job is reading the backups from.

What can I look at to eliminate this waste of 350GB of unused tape space?

Thanks,
jonathan


This is a PRIVATE message. If you are not the intended recipient, please delete without copying and kindly advise us by e-mail of the mistake in delivery. NOTE: Regardless of content, this e-mail shall not operate to bind SKOPOS to any order or other contract unless pursuant to explicit written agreement or government initiative expressly permitting the use of e-mail for such purpose.