Hi All,
Firstly, the good news: it is all solved (for me), so this is just here
as an "if this happens to you, maybe this will work for you", and a
potential help in finding/fixing a bug.
I had been running BackupPC 3.3.x for some years, and it was working
really well. However, some servers had a "standard" workload which
involved copying large numbers of photos from a camera to a "tmp"
directory, and then copying and/or moving those photos to other folders.
In addition, from time to time a folder containing 100,000 or more
photos (in sub-directories) would be renamed. Using rsync with BackupPC
3.3.x meant that every time the path (file name or folder name) changed,
the file was downloaded again over the slow remote connection, then
discarded and linked back to the old file in the pool/cpool (this is all
as designed).
So, to solve that issue, about a year ago I upgraded to BackupPC v4, and
it works much better in that it avoids re-downloading the same file when
the path changes.
However, for hosts with old pre-existing backups, I was getting a number
of "Errors" during each backup, which referred to being unable to find
the pool files. Mostly I just cleared the pc/hostname folder, and
BackupPC started making fresh backups, which solved the problem.
For one host, I didn't do that. So for the past 12 months since the
upgrade, I've been getting about 50,000 errors on each backup.
Last week, the host was exposed to one of the crypto locker variants,
and it managed to encrypt a lot of files (approx 65,000). I thought OK,
easy, I'll just restore the files from backup.
Problem 1)
I could easily generate a text file containing the list of filenames
that needed to be restored (find / -name '*.encrypted' | sed -e
's/\.encrypted$//' > /tmp/filelist_to_restore). However, I couldn't
easily feed this list to any of the BackupPC tools to build a tar file
with only those files. In the end, I used BackupPC_tarCreate to restore
all files from the most recent backup, piped that to a tar extract of
only the files in the list, and then re-created a compressed tar file
from the extracted files. I could then transfer that tar file to the
remote host and extract it into place (and delete the encrypted version
of each file).
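The pipeline above can be sketched roughly as follows. All paths and the
host/share names are examples only, and a plain "tar -cf - ." stands in
here for the BackupPC_tarCreate stream so the sketch is self-contained;
note that the entries in the file list must match the archive member
names exactly (e.g. with a leading "./").

```shell
# Stand-in source tree and file list (examples only).
mkdir -p /tmp/full /tmp/restore
printf 'keep' > /tmp/full/wanted.txt
printf 'skip' > /tmp/full/other.txt
printf './wanted.txt\n' > /tmp/filelist_to_restore

# 1) Stream the "full backup" as a tar archive,
# 2) extract only the files named in the list.
# On the real system the first command would be something like:
#   BackupPC_tarCreate -h <hostname> -n -1 -s <sharename> .
tar -C /tmp/full -cf - . \
  | tar -x -C /tmp/restore -T /tmp/filelist_to_restore

# 3) Re-pack just the extracted files for transfer to the remote host.
tar -czf /tmp/restore.tar.gz -C /tmp/restore .
```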
This seemed to solve the problem, and everything carried on... until the
next day.
Problem 2)
Some of the restored files were "corrupted". Eventually, it was found
that these "corrupt" files had exactly the right permissions, date/time,
and size, but the contents were wrong (full of nulls (binary 0)). The
affected files were found to correlate with errors during the restore
process, along the lines of "can't find /dev/null" or similar. I could
probably dig up the exact error message if someone is interested.
So, eventually, since the timestamp (and permissions/size) was correct,
I could test all the files, find which ones had all-null content, and
then I found the answer I was hoping for: all the corrupted files were
ones that had not been modified since before the upgrade. The solution
was now simple: restore those files from an older backup taken prior to
the BackupPC upgrade.
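The "test all the files" step can be sketched like this (a minimal
sketch; the function name and paths are my own, not from any BackupPC
tool): strip every NUL byte with tr, and if nothing is left, the whole
file was NULs.

```shell
# Return success (0) if the file is non-empty and consists
# entirely of NUL bytes.
is_all_null() {
  [ -s "$1" ] && [ "$(tr -d '\0' < "$1" | wc -c)" -eq 0 ]
}

# Example: scan a restored tree and list suspect files
# find /some/restored/tree -type f | while read -r f; do
#   is_all_null "$f" && echo "all-null: $f"
# done
```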
Moving forward, the fix was also simple: move the pc/<hostname> folder
away and let BackupPC start fresh, and the backups are now working
correctly.
So, I think the above sums it up. Obviously each step along the way was
accompanied by the fear that all was lost, but it all turned out well in
the end.
If you are thinking of upgrading from 3.3 to 4, then my personal
suggestion would be to start with a clean pc directory and pool/cpool,
although the upgrade did work for some hosts without any issues.
Regards,
Adam
--
Adam Goryachev Website Managers www.websitemanagers.com.au
------------------------------------------------------------------------------
_______________________________________________
BackupPC-users mailing list
BackupPC-users AT lists.sourceforge DOT net
List: https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki: http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/