Subject: [BackupPC-users] [OT] Perl hacking (was: Re: adding default exclude list to all your share/host)
From: Holger Parplies <wbppc AT parplies DOT de>
To: ADNET Ghislain <gadnet AT aqueos DOT com>
Date: Sat, 12 Sep 2009 05:49:06 +0200
Hi,

ADNET Ghislain wrote on 2009-09-06 10:47:30 +0200 [[BackupPC-users] adding 
default exclude list to all your share/host]:
> [...]
> Here is the code. I just wanted to share it in case it helps someone. It
> seems to work for me (my Perl skill is VERY low, so be kind):
> 
>    while ( my ($key,$value) = each(%{$Conf{BackupFilesExclude}}) ) {
>        push (@{$value}, '**/logs/**' );
>        push (@{$value}, '**/log/**' );
>        push (@{$value}, '**/cache/**' );
>        push (@{$value}, '**/tmp/**' );
>        push (@{$value}, '**/temp/**' );
>        push (@{$value}, '**/typo3temp/**' );
>    }
> 
> 
> This can of course also be used for a default include list. I just need
> to find a way to prevent duplicate adds, but I do not know if rsync will
> be bothered by having the same exclude multiple times in its list.

if you want it in one line, try ...

  $Conf{BackupFilesExclude} = {map {($_,[keys %{{map {($_,1)}
    @{$Conf{BackupFilesExclude}{$_}}, map {"**/$_/**"}
    qw/log logs tmp temp typo3temp cache/}}])}
    keys %{$Conf{BackupFilesExclude}}};

;-), but it's (slightly) more readable as ...

  $Conf{BackupFilesExclude} = {
    map { ($_, [keys %{{map {($_, 1)} @{$Conf{BackupFilesExclude}{$_}},
                                      map {"**/$_/**"}
                                          qw/log logs tmp temp typo3temp cache/
                      }}
               ]
          )
        }
        keys %{$Conf{BackupFilesExclude}}
  };

This makes use of the fact that a hash cannot have more than one entry with
the same key, so adding all items (as keys, with a dummy '1' value) to a hash
(that's the "map { ($_, 1) }") and then extracting the keys gets rid of
duplicates. It puts them in random order, though, which doesn't matter unless
you are doing fancy things like putting includes in BackupFilesExclude
("+ /some/path") ...

If you want to avoid reordering, you could use something like

  foreach my $value (values %{$Conf{BackupFilesExclude}}) {
    push @$value, grep { my $p = $_; not grep { $p eq $_ } @$value }
                       map {"**/$_/**"} qw/log logs tmp temp typo3temp cache/;
  }

which will add ("push") those values ("grep") that are not yet ("not grep") in
the array. You'll note that without the "grep { ... }" this is exactly your
example, written in a slightly different way ($key is not used, so 'values'
instead of 'each'; push several values at once; map for brevity); the
grep-less version is spelled out below. As
always, there are many ways to get the job done. Choose whichever you feel
most comfortable with.
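
For completeness, the grep-less version referred to above would be just

  foreach my $value (values %{$Conf{BackupFilesExclude}}) {
    push @$value, map {"**/$_/**"} qw/log logs tmp temp typo3temp cache/;
  }

i.e. your loop, minus $key and with the six patterns pushed in one go (but
without the duplicate check).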

And, no, I believe rsync doesn't mind duplicate excludes, so you can just
leave things as they are.

Regards,
Holger

_______________________________________________
BackupPC-users mailing list
BackupPC-users AT lists.sourceforge DOT net
List:    https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:    http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/
