Re: [Bacula-users] backing up to Amazon S3
2010-05-14 17:52:26
On 05/14/2010 02:02, Denis Shaposhnikov wrote:
> Does anybody have working examples of using s3fs or s3cmd with bacula for
> using Amazon S3 as bacula storage?
>
> I've tried s3fs but it seems bacula 5.0.0 doesn't want to use s3 bucket
> mounted into a directory because it doesn't have "." and ".." entries.
Hmm... no answers. OK, I've ended up mounting the S3 bucket at system boot,
like this:

  echo "/usr/local/sbin/s3fs -o use_cache=/home/bacula/s3fs-tmp bucket
  /var/db/bacula/s3fs" | su -m bacula
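For reference, the same mount as a small boot script (a sketch only: it assumes an rc.local-style hook, the s3fs binary path and bucket name from my command above, and uses `su -c` instead of piping into su; adjust all of these to your system):

```shell
#!/bin/sh
# Sketch of a boot-time mount for the Bacula S3 storage directory.
# "bucket", the cache path, and the mountpoint are the values from my
# setup; substitute your own.
# use_cache is required: without it Bacula cannot label new volumes.
su -m bacula -c \
  "/usr/local/sbin/s3fs -o use_cache=/home/bacula/s3fs-tmp bucket /var/db/bacula/s3fs"
```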
Don't forget "use_cache": I've found that bacula can't label new volumes
without it, and just keeps retrying the label forever without success.
I've also added

  Run After Job = "/usr/bin/find /home/bacula/s3fs-tmp -type f -exec
  /bin/rm -f {} +"
  Run After Failed Job = "/usr/bin/find /home/bacula/s3fs-tmp -type f
  -exec /bin/rm -f {} +"

to my JobDefs, because I don't want to keep volumes on the local file
system; that's why I use S3 in the first place. Ideally the cached volume
would be removed right after bacula closes it, but I don't know how to
implement that.
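Since the same find command appears in both Run After lines, it could be factored into one helper script (a sketch; the cache path is the one from my use_cache setting, and the script name is hypothetical):

```shell
#!/bin/sh
# Hypothetical helper, e.g. /usr/local/sbin/clean-s3fs-cache:
# remove every cached volume file under the s3fs cache directory,
# keeping the directory tree itself so s3fs can keep writing into it.
CACHE_DIR="${1:-/home/bacula/s3fs-tmp}"
find "$CACHE_DIR" -type f -exec rm -f {} +
```

Then both JobDefs directives would just run the script, e.g.
Run After Job = "/usr/local/sbin/clean-s3fs-cache" (path is an example).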
And last, I've patched s3fs with a slightly modified diff from
http://code.google.com/p/s3fs/wiki/FuseOverAmazon (the comment by
estabroo, May 10). I'm attaching that diff to this mail.

patch-s3fs.cpp
Description: Text document
------------------------------------------------------------------------------
_______________________________________________
Bacula-users mailing list
Bacula-users AT lists.sourceforge DOT net
https://lists.sourceforge.net/lists/listinfo/bacula-users