Subject: Re: [BackupPC-users] Problems with hardlink-based backups...
From: Les Mikesell <lesmikesell AT gmail DOT com>
To: "General list for user discussion, questions and support" <backuppc-users AT lists.sourceforge DOT net>
Date: Mon, 31 Aug 2009 17:22:27 -0500
Jeffrey J. Kosowsky wrote:
>
>  > > I have seen problems where the attrib files are not synchronized with
>  > > the backups or when the pc tree is broken. In fact, that is the reason
>  > > I wrote several of my routines to identify and fix such problems. Now
>  > > true, the cause is typically due to crashes or disk/filesystem issues
>  > > outside of the direct scope of BackupPC but there are real-world
>  > > synchronization and integrity issues that can arise.
>  > 
>  > But nothing you've proposed will make any difference in this respect. 
>  > Well, maybe different, but nothing to enforce additional synchronization 
>  > with the file content.
> 
> Except that it is easier to back up a database than to back up
> thousands if not millions of scattered attrib files.

Is it?  I think rsync or tar would handle them rather easily and toss 
them onto just about any media, without needing a matching database 
application to be running.  Why is it OK to require an SQL database 
application but not something like ZFS?
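
Something along these lines would collect every attrib file in the pc tree 
into a single archive (a rough sketch in Python rather than Perl, and the 
TopDir path below is just an example; adjust for your install):

    import os
    import tarfile

    # Assumed location of the BackupPC pc tree; not universal, adjust as needed.
    TOP_DIR = "/var/lib/backuppc/pc"

    with tarfile.open("attrib-backup.tar.gz", "w:gz") as tar:
        for dirpath, dirnames, filenames in os.walk(TOP_DIR):
            if "attrib" in filenames:
                # one per-directory attribute file in the pc tree
                tar.add(os.path.join(dirpath, "attrib"))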

> Also, there are
> well-known tools for checking database consistency while you need to
> write custom ones for attrib files. 

I've found them to be as reliable as the underlying filesystem.  Perhaps 
that is your real problem.

>  > > Indeed, there is no
>  > > question in my mind that a single well-constructed relational database
>  > > would be orders of magnitude faster here.
>  > 
>  > Until you go to get the data, which is kind of the point.
> 
> I can only tell you how slow and non-optimized the current
> implementation is. Do you really believe that a relational database
> wouldn't be significantly faster than the current approach of
> finding/opening/reading/decompressing/parsing multiple layers of
> attrib files?

It could be, if it were optimized for that.  But updating them might be 
slower, finding the records that need to expire and releasing their 
space might be a lot slower, and the usual case of getting both the 
attributes and the data at the same time would probably be the same or 
worse.
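
Just to make the comparison concrete, here is a toy sqlite sketch in Python; 
the table and column names are purely made up, not anything BackupPC actually 
uses, but it shows where I'd expect the costs to land:

    import sqlite3

    con = sqlite3.connect("attribs.db")
    con.execute("""
        CREATE TABLE IF NOT EXISTS attrib (
            backup_num INTEGER,
            path       TEXT,
            mode       INTEGER,
            uid        INTEGER,
            gid        INTEGER,
            size       INTEGER,
            mtime      INTEGER,
            pool_hash  TEXT,
            PRIMARY KEY (backup_num, path)
        )
    """)

    # The lookup side: fetching one file's attributes is a single indexed
    # read, which is the case a database would plausibly speed up.
    row = con.execute(
        "SELECT mode, uid, gid, size, mtime, pool_hash"
        " FROM attrib WHERE backup_num = ? AND path = ?",
        (123, "/etc/passwd"),
    ).fetchone()

    # The expiry side: removing a backup means finding and deleting every
    # row for that backup and letting the database reclaim the space, which
    # is the part I would expect to be slower than unlinking a directory tree.
    con.execute("DELETE FROM attrib WHERE backup_num = ?", (123,))
    con.commit()
    con.close()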

-- 
   Les Mikesell
     lesmikesell AT gmail DOT com

