Thanks Ana!
Something like:

SELECT ClientId, SUM(JobFiles) AS NB FROM Job GROUP BY ClientId ORDER BY NB DESC;

should also do the trick, a bit faster ;-)
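[Editor's note: a quick sanity check of that aggregate against a toy in-memory SQLite copy of the Job table. Column names follow the Bacula catalog schema; the sample rows are invented, and SQLite stands in for MySQL here.]

```python
import sqlite3

# In-memory stand-in for the Bacula catalog's Job table (sample data is invented).
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE Job (JobId INTEGER PRIMARY KEY, ClientId INTEGER, JobFiles INTEGER)")
con.executemany("INSERT INTO Job (ClientId, JobFiles) VALUES (?, ?)",
                [(1, 100), (1, 250), (2, 40), (3, 500)])

# Eric's query: total files per client, straight from Job, no joins needed.
rows = con.execute(
    "SELECT ClientId, SUM(JobFiles) AS NB FROM Job GROUP BY ClientId ORDER BY NB DESC"
).fetchall()
print(rows)  # [(3, 500), (1, 350), (2, 40)]
```

Because it reads only the Job table and sums a pre-computed counter, it avoids the large three-way join over File and Filename entirely.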
Best Regards,
Eric
On 07.10.15 at 15:23, Ana Emília M. Arruda wrote:
> Hello Stephen,
>
> On Mon, Oct 5, 2015 at 2:17 PM, Stephen Thompson
> <stephen AT seismo.berkeley DOT edu> wrote:
>
>
> Regarding:
> > Would be nice also if you can give the number of Filename per Client
> (from the job table).
>
> Do you have a sample SQL to retrieve this stat?
>
>
> SELECT Client.Name, COUNT(DISTINCT Filename.FilenameId)
> FROM Client, Filename, File, Job
> WHERE Filename.FilenameId = File.FilenameId
>   AND File.JobId = Job.JobId
>   AND Job.ClientId = Client.ClientId
> GROUP BY Client.ClientId;
>
> The above query should work.
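[Editor's note: the same query run against a toy in-memory SQLite version of the four catalog tables, reduced to the referenced columns. All table data here is invented; SQLite stands in for MySQL.]

```python
import sqlite3

# Toy catalog with only the columns Ana's query touches (data is invented).
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE Client   (ClientId INTEGER PRIMARY KEY, Name TEXT);
CREATE TABLE Filename (FilenameId INTEGER PRIMARY KEY, Name TEXT);
CREATE TABLE Job      (JobId INTEGER PRIMARY KEY, ClientId INTEGER);
CREATE TABLE File     (FileId INTEGER PRIMARY KEY, JobId INTEGER, FilenameId INTEGER);
INSERT INTO Client VALUES (1, 'alpha'), (2, 'beta');
INSERT INTO Filename VALUES (10, 'a.txt'), (11, 'b.txt'), (12, 'c.txt');
INSERT INTO Job VALUES (100, 1), (101, 1), (102, 2);
-- alpha backs up a.txt twice and b.txt once; beta backs up c.txt once.
INSERT INTO File VALUES (1000, 100, 10), (1001, 101, 10), (1002, 101, 11), (1003, 102, 12);
""")

# Ana's query: distinct filenames per client; DISTINCT collapses a.txt's two File rows.
rows = sorted(con.execute("""
    SELECT Client.Name, COUNT(DISTINCT Filename.FilenameId)
    FROM Client, Filename, File, Job
    WHERE Filename.FilenameId = File.FilenameId
      AND File.JobId = Job.JobId
      AND Job.ClientId = Client.ClientId
    GROUP BY Client.ClientId
""").fetchall())
print(rows)  # [('alpha', 2), ('beta', 1)]
```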
>
> Best regards,
> Ana
>
> thanks,
> Stephen
>
> On 10/03/2015 12:02 AM, Eric Bollengier wrote:
> > Hello Stephen,
> >
> > On 10/03/2015 12:00 AM, Stephen Thompson wrote:
> >>
> >>
> >> All,
> >>
> >> I believe I'm having MySQL database issues since upgrading to
> >> 7.2 (from 7.0.2). I run MySQL InnoDB with a 900 GB database
> >> that's largely the File table.
> >
> > For large catalogs, we usually advise using PostgreSQL, where we
> > have multi-terabyte databases in production.
> >
> >> Since upgrading, I lose a few jobs a night due to database locking
> >> timeouts, which I have set to 3600. I also log slow queries.
> >
> > Can you get some information about these locks? On which table?
> > Can you give some statistics on your catalog, like the size and
> > the number of records of the File, Filename, and Path tables? It
> > would also be nice if you could give the number of Filenames per
> > Client (from the Job table).
> >
> > You might have many orphan Filenames, and MySQL is not always
> > very good at joining large tables (it uses nested loops, and
> > cannot use the index on the Text column in all queries).
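[Editor's note: the thread doesn't give a query for this, but one common way to count orphan Filename rows, i.e. names no File row references any more, is a LEFT JOIN anti-join. A toy sketch, again with invented data and SQLite standing in for MySQL:]

```python
import sqlite3

# Toy catalog: Filename rows 11 and 12 are orphans (no File row points at them).
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE Filename (FilenameId INTEGER PRIMARY KEY, Name TEXT);
CREATE TABLE File     (FileId INTEGER PRIMARY KEY, FilenameId INTEGER);
INSERT INTO Filename VALUES (10, 'kept'), (11, 'orphan-1'), (12, 'orphan-2');
INSERT INTO File VALUES (1, 10);
""")

# Anti-join: count Filename rows with no matching File row.
(orphans,) = con.execute("""
    SELECT COUNT(*)
    FROM Filename
    LEFT JOIN File ON File.FilenameId = Filename.FilenameId
    WHERE File.FilenameId IS NULL
""").fetchone()
print(orphans)  # 2
```

On a real 900 GB catalog this scan would be slow; it's only meant to show the shape of the check.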
> >
> >> It appears that typically during a month I have about 90-100
> >> queries that take longer than 15 minutes to run. Already this
> >> month (upgraded earlier this week), I have 32 queries that take
> >> longer than 15 minutes. At this rate (after 2 days) that will up
> >> my regular average of 90-100 to 480!
> >>
> >> Something is wrong and the coincidence is pretty strong that it's
> >> related to the upgrade.
> >
> > Maybe, but I'm not sure; we did not change a lot of things in
> > this area, we mostly did refactoring.
> >
> > Best Regards,
> > Eric
> >
>
> --
> Stephen Thompson                Berkeley Seismological Laboratory
> stephen AT seismo.berkeley DOT edu    215 McCone Hall # 4760
> Office: 510.664.9177            University of California, Berkeley
> Remote: 510.214.6506 (Tue,Wed)  Berkeley, CA 94720-4760
>
>
> ------------------------------------------------------------------------------
> _______________________________________________
> Bacula-users mailing list
> Bacula-users AT lists.sourceforge DOT net
> https://lists.sourceforge.net/lists/listinfo/bacula-users
>
>