
Results:

References: [ subject: "[Bacula-users] Best compression method for 90TB+ of data per week" (including Re:/Fwd: variants): 2 ]

Total 2 documents matching your query.

1. [Bacula-users] Best compression method for 90TB+ of data per week (score: 1)
Author: "Boutin, Stephen" <Stephen.Boutin AT lightningsource DOT com>
Date: Wed, 7 Mar 2012 10:49:38 -0600
Hi all, I am currently running Bacula v5.0.3 in a Production environment. I have about 25TB worth of data being backed up by it for a week's worth of Full & Incremental backups. I need to migra
/usr/local/webapp/mharc-adsm.org/html/Bacula-users/2012-03/msg00104.html (13,045 bytes)

2. Re: [Bacula-users] Best compression method for 90TB+ of data per week (score: 1)
Author: Laurent Papier <bacula AT tuxfan DOT net>
Date: Wed, 7 Mar 2012 22:28:18 +0100
lume size) to 15GB for my data pool of the smaller backups & 150GB for my data pool of the larger ones. Please let me know your thoughts & ideas on how to implement a solution for this. Thanks! Hi, t
/usr/local/webapp/mharc-adsm.org/html/Bacula-users/2012-03/msg00115.html (13,512 bytes)

