Subject: Re: [BackupPC-users] Looking at BackupPC notation clarification
From: Ted Hilts <ehilts AT mcsnet DOT ca>
To: backuppc-users AT lists.sourceforge DOT net
Date: Mon, 12 Apr 2010 16:56:54 -0600
On Sun, 2010-04-11 at 02:19 +0100, Luis Paulo wrote:
> Hi, Ted
> 
> 
>         Luis Paulo
>         
>         After a bit it became evident that I needed more authority, so
>         I ended up doing the following:
>         Opened a bash session terminal.
>         then
>         sudo htpasswd /etc/backuppc/htpasswd backuppc
>         then
>         I first was prompted for my password because of using sudo
>         then
>         I was prompted for new password (twice) after which I got the
>         response:
>         Updating passwd for user backuppc
>         then
>         I opened Firefox browser and entered:
>         http://192.168.1.16/backuppc/
>         
>         then (I think) I got the request for user and password, which I
>         provided, after which the web GUI came up in Firefox with the
>         following information:
>         
>         
> Well done. Relax, you're up and running.
> 
> Now go back to basics. Start by configuring clients. Use the GUI, left
> menu -> Edit Config and Edit Hosts (or edit the files
> at /etc/backuppc), go back to the documentation.
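> 
> For instance (just a sketch; the paths follow the Ubuntu package, and
> "mylaptop" is only an example name), a client is one line in
> /etc/backuppc/hosts:
> 
>         # host        dhcp    user
>         mylaptop      0       backuppc
> 
> and its transfer settings go in /etc/backuppc/config.pl, or in a
> per-host override file so each client can differ.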
>  
>         General Server Information
>              * The servers PID is 6368, on host Ubuntu, version 3.0.0,
>                started at 4/9 15:56.
>              * This status was generated at 4/10 17:10.
>              * The configuration was last loaded at 4/9 15:56.
>              * PCs will be next queued at 4/10 18:00.
>              * Other info:
>                      * 0 pending backup requests from last scheduled wakeup,
>                      * 0 pending user backup requests,
>                      * 0 pending command requests,
>                      * Pool is 0.03GB comprising 3355 files and 3267
>                        directories (as of 4/10 01:00),
>                      * Pool hashing gives 0 repeated files with longest
>                        chain 0,
>                      * Nightly cleanup removed 0 files of size 0.00GB
>                        (around 4/10 01:00),
>                      * Pool file system was recently at 85% (4/10 17:06),
>                        today's max is 85% (4/10 01:00) and yesterday's
>                        max was 85%.
> 
> "Pool is 0.03GB" and "Pool file system was recently at 85%". I Think
> you may have little space. Pool (the backup files) are stored
> at /var/lib/backuppc/ (and below)
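> 
> A quick way to check how much room the pool has (standard commands,
> nothing BackupPC-specific):
> 
>         df -h /var/lib/backuppc         # free space on the pool file system
>         sudo du -sh /var/lib/backuppc   # space the pool itself is using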
> 
> Everything else is expected.
> 
> 
>         Currently Running Jobs
>             Host | Type | User | Start Time | Command | PID | Xfer PID
>         
>         Failures that need attention
>             Host | Type | User | Last Try | Details | Error Time | Last error (other than no ping)
>         
>         Does the Pool have a copy of all my files from "/"?
>         
> 
> Can't really say. Check Host Summary. 
>  
>         Do I do this same thing for the next machine, or can I handle
>         the other machines remotely from 192.168.1.16? Or do I even
>         want to? I don't want the backups on the same machine that is
>         being backed up, and it looks like that may be the case?
>         
> 
> One machine, 192.168.1.16, is the server and stores all the backups
> from all the clients (hosts).
> With a bit of configuration, the server will automatically start
> backups for all your machines.
> Of course, you can have a 2nd server backing up this one. Right?
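> 
> As a rough sketch (BackupPC 3.x option names; "mylaptop" is the example
> host from before), a Linux client backed up with rsync over ssh needs
> little more than a hosts entry plus something like this in its config:
> 
>         $Conf{XferMethod}     = 'rsync';
>         $Conf{RsyncShareName} = ['/'];
>         # plus an ssh key so the backuppc user can reach the client
> 
> Check the documentation for whichever method you end up picking.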
>  
>         Are the files in the backup now compressed?
>         
> 
> Edit Config -> Backup Settings -> CompressLevel
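> 
> In the config file that is just one line, for example:
> 
>         $Conf{CompressLevel} = 3;    # 0 = no compression, 1-9 = zlib levels
> 
> A level of 3 is a common compromise; higher levels cost more CPU for
> little extra saving.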
>  
>         How would I do a restore if tomorrow 192.168.1.16 died?
>         
> 
> Small steps... ok? First get a working backup machine. Anyway, you
> would be able to restore file by file, or download a zip or tar
> archive, and you could restore to the same machine or to another one.
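> 
> As an illustration of the tar route (the path follows the Ubuntu
> package and "mylaptop" is still the example host; check both), you can
> pull a whole backup out of the pool from the command line:
> 
>         sudo -u backuppc /usr/share/backuppc/bin/BackupPC_tarCreate \
>                 -h mylaptop -n -1 -s / . > mylaptop-last.tar
> 
> That writes the most recent backup of share "/" for that host as a
> plain tar file you can extract anywhere.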
>  
>         Should I have a dedicated backup and restore machine and if
>         yes can I
>         use an older machine (and I mean old)?
>         
> 
> It depends. Really. It depends, for example, on which method you will
> be using (rsync, samba, tar), how many clients, ...
> I use my oldest machine, but it is an Athlon 64 with 3GB of RAM and
> lvm/mdadm RAID 1 (150GB now, pool at 52%, compression level 3),
> backing up 4 Linux rsync clients and 2 smb virtual machines.
>  
>         How do my questions compare with the current defaults?
>         
>         Sorry for all the questions and based on your response there
>         will be
>         many more questions.
>         
> 
> and I'll do my best to answer them. 
> 
> Good work. Small steps. It can be a bit hard to set up (not really),
> but it needs almost no maintenance.
> Luis
>  
>         Thanks -- Ted Hilts
>         PS: I probably won't get back to my email till later tonight
>         or  later
>         but I will try to take a look at your responses.
>         Thanks again. Ted
>         
>         
>         
> 
Luis Paulo:

Yes, I agree that I next need to deal with the clients issue.  But I
still have a concern about having BackupPC, the server, and the pool all
on the same machine, 192.168.1.16, named "Ubuntu".

Maybe it would help you help me if I explained a few things.

My application involves the collection of web pages associated with the
news.  I am talking about thousands of web pages every week.  Once a web
page (or web tree) is collected it is static forever -- that is to say
its contents and appearance will never change.  However, I and others
will need to access these web pages by means of a browser.  This next
part is very important.  I cannot leave these collections of web pages
on dynamic media for several reasons.

Reason #1 is the collection process itself.  I use Firefox with add-ons
that make it possible to collect and export these collections.  So there
is an operation running somewhere under root "/" that accumulates these
collections (my data) under "/" on Linux and under C:\... on MS Windows.
(BTW, I am trying to move all MS Windows machines over to Linux, but the
change is slow and there are still Windows XP machines involved.)  The
main thing I am trying to say is that data builds up right inside the
processing scheme, and if I don't regularly export and move this data
out to another machine or disk then "/" would fill up with data and
eventually all operations would fail, not just BackupPC.  I know this is
true because I watch the data build up and then the free space come back
once it is exported and moved.  Right now BackupPC seems to be focused
on the Ubuntu Linux OS, which is OKAY because that needs to be regularly
backed up even though there are only small changes.

Reason #2 is the BackupPC pool, which manages the backed-up files.  I
have tried to keep "/" clear of excessive data build-up; that is, I
check to make sure that it does not build up and cause problems.

BUT as soon as I add clients from other machines I'm going to be in a
lot of trouble; I won't be able to control the build-up of data in the
BackupPC pool and also handle everything else on 192.168.1.16.
THEREFORE, is it possible to MOVE the backup POOL to another machine
specifically set up for that purpose?
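
For example (and this is just me guessing at how it might be done),
would something like mounting a bigger dedicated disk, or an NFS export
from a storage machine, at /var/lib/backuppc be the right approach?
Roughly:

    sudo /etc/init.d/backuppc stop
    sudo rsync -aH /var/lib/backuppc/ /mnt/newpool/   # copy the pool, keeping hard links
    # ...mount the new disk (or export) at /var/lib/backuppc...
    sudo /etc/init.d/backuppc start

Or is that not how it is supposed to be done?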

Reason #3 is sort of the same problem.  I need to be able to keep data
(NOT OS data but NEWS data in the form of web pages) on both DVD and
magnetic media (hard drives, external drives, etc.).  This means that
there is duplication occurring.  Unfortunately for me, I have
experienced failures of hard drives and external USB drives, which
resulted in major data loss and even the loss of entire machines.  So
now I am paranoid.  I lost weeks if not months of data.

What these 3 reasons come down to is that I have two kinds of data.  One
kind is the operating system and the data it generates as it executes
various processes.  The other kind is tons of web pages being generated,
then exported and moved, plus copies on DVD.  This last category means
there is a constant build-up of NEW data.

The only incremental data would (I think) be that associated with the
operating system, as changes occur and temporary data gets created and
then moved out of "/".

Since all machines (just like the magnetic media inside them) are likely
to die sooner or later, at the worst possible time, I need to be able to
set up something alongside BackupPC (not in place of it but in addition
to it) so I can replace one machine with a new machine.  I have been
working on a Ghost for Linux ("G4L") type of approach.  But I want to
get BackupPC up and running first in order to deal with the CLIENTS.
After that I guess you cannot help me, because BackupPC is not
associated with "G4L" -- right?

Hope all this makes sense.  

I certainly appreciate your help so far.  This stuff was just a bit too
new for me to feel confident about.

Thanks -- Ted Hilts
Looking forward to your response, Ted

 
