On Wed, 31 Aug 2005, Steve Wray wrote:
> Geert Uytterhoeven wrote:
> > On Tue, 30 Aug 2005, Graeme Humphries wrote:
> >
> >>Guy Dallaire wrote:
> >>
> >>>Yes, thanks. I know about hard links. But how would it impact the size
> >>>or performance of my backups ?
> >>>
> >>
> >>Well, if a file is hard linked multiple times, it'll be backed up multiple
> >>times. Therefore, a filesystem with tons of hard links will take a really
> >>long time to back up. :)
> >
> > Fortunately tar is sufficiently smart to back it up only once.
> >
> > Usually the problem with lots of hard links is not the data timeout value,
> > but the estimate timeout value, as I found out the hard way[*].
>
> We've been having similar problems with estimates timing out. I just
> ran the 'find' command given in an earlier email and found a grand total
> of 607 hard links on the entire filesystem.
>
> What I'm wondering is, does 607 count as 'lots' WRT amanda estimate
> timeouts?
Not really, given that I have many files with more than 600 hard links each.
I seem to have 1582186 of them in my cluster of Linux kernel source trees.
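As a quick sketch of both points (counting hard-linked files, and tar storing a multiply-linked file only once), something like the following works on a Linux box with GNU find and GNU tar. The scratch directory and file names are purely illustrative:

```shell
# Sketch only: (a) count regular files with a link count > 1, as with the
# find command mentioned earlier; (b) show tar recording the second name
# of a hard-linked file as a link rather than a second copy of the data.
set -e
dir=$(mktemp -d)          # throwaway scratch area
cd "$dir"
echo "hello" > a
ln a b                    # b is a hard link to a (same inode)

# (a) files with more than one link; counts both names, so this prints 2
find . -xdev -type f -links +1 | wc -l

# (b) tar lists b as "link to a" instead of duplicating its contents
tar cf archive.tar a b
tar tvf archive.tar | grep 'link to'

cd / && rm -rf "$dir"
```

Note that find reports every name of a multiply-linked file, so a filesystem with 607 hits from that command may contain far fewer distinct inodes.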
Gr{oetje,eeting}s,
Geert
--
Geert Uytterhoeven -- There's lots of Linux beyond ia32 -- geert AT linux-m68k
DOT org
In personal conversations with technical people, I call myself a hacker. But
when I'm talking to journalists I just say "programmer" or something like that.
-- Linus Torvalds