Re: [Bacula-users] Tuning for large (millions of files) backups?
2010-11-11 06:59:08
Henrik Johansen wrote:
> I have had about as much of this as I can take now so please, stop spreading
> FUD about MySQL.
Have you used MySQL with datasets in excess of 100-200 million objects?
I have. Our current database holds about 400 million File table entries.
MySQL requires significant tuning and kernel tweakery, and it uses a lot
more memory than Postgres does for the same dataset.
For Bacula users, it's a lot _easier_ to use Postgres on a large
installation than it is to use MySQL.
I held off switching to Postgres for a long time because I was
unfamiliar with it; having done so, however, I'm glad that I did. It has
required virtually zero tweaking since it was set up, runs approximately
twice as fast as MySQL did, and has a RAM footprint about half the size
of MySQL's.
Small datasets are fine with MySQL and may even perform better there.
Ours ran brilliantly up to about 50 million entries and then required
tuning.
This discussion is about appropriate tools for the job.
If you wish to contribute usefully to the thread, then provide some
assistance to the OP regarding tuning his MySQL for optimum performance.
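To that end, here is a starting-point my.cnf sketch for a large
InnoDB-backed Bacula catalog. The values are illustrative assumptions,
not recommendations: size them against your own RAM and workload, and
note that older Bacula installs may have created the catalog tables as
MyISAM rather than InnoDB.

```ini
[mysqld]
# Assumes the catalog (File, Path, Filename) tables are InnoDB.
innodb_buffer_pool_size = 4G        ; biggest win: cache as much of the File table and its indexes as RAM allows
innodb_log_file_size    = 256M      ; larger redo logs smooth the big attribute-insert bursts at end of job
innodb_flush_log_at_trx_commit = 2  ; trades up to a second of durability for much faster inserts
sort_buffer_size        = 8M        ; per-session; helps the large sorts when browsing files for restore
```

After changing innodb_log_file_size on older MySQL versions, the server
must be shut down cleanly and the old ib_logfile* files removed before
restart, or it will refuse to start.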
_______________________________________________
Bacula-users mailing list
Bacula-users AT lists.sourceforge DOT net
https://lists.sourceforge.net/lists/listinfo/bacula-users