On 16.04.2013 18:05, Elvar wrote:
I am currently using Duplicity to make backups of a fast growing email archive
solution. I have Duplicity backing the data up via FTP to an offsite server. I
performed the initial full backup and have been doing incrementals since. I'm
using 250M volumes to try and cut down on the number of files on the remote
server. My question is: is the method I'm using viable long term? Performing
semi-routine full backups is not an option, due to how long they take and the
amount of data that has to be transferred.
No. Currently, when one signature or volume becomes corrupt, all following backups
become unusable as well. So you either
1. have to do fulls on a regular schedule,
or
2. do new backups against an old full, by manually moving the incrementals somewhere else
on the backend (and back, if you want to restore a backup contained in them). NOTE: this
is a hack and not advised, but it is currently the only way to "rebase" incrementals.
Also, option #2 assumes that your full will never get corrupted, which is
probably not very clever either.
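A minimal sketch of what the "rebase" hack in #2 amounts to, assuming you can reach the backend directory directly (a temp dir with dummy files stands in for the real FTP backend here; the timestamps are made up, only the naming scheme matches duplicity's):

```shell
set -eu

# Hypothetical backend layout; a temp dir stands in for the FTP server.
BACKEND=$(mktemp -d)
ARCHIVE="$BACKEND/inc-archive"
mkdir -p "$ARCHIVE"

# Dummy files following duplicity's naming scheme (timestamps invented):
#   duplicity-full.<t>.vol1.difftar.gpg            full-backup volume
#   duplicity-full-signatures.<t>.sigtar.gpg       signatures of the full
#   duplicity-inc.<t1>.to.<t2>.vol1.difftar.gpg    incremental volume
#   duplicity-new-signatures.<t1>.to.<t2>.sigtar.gpg
touch "$BACKEND/duplicity-full.20130101T000000Z.vol1.difftar.gpg"
touch "$BACKEND/duplicity-full-signatures.20130101T000000Z.sigtar.gpg"
touch "$BACKEND/duplicity-inc.20130101T000000Z.to.20130201T000000Z.vol1.difftar.gpg"
touch "$BACKEND/duplicity-new-signatures.20130101T000000Z.to.20130201T000000Z.sigtar.gpg"

# The hack: move incrementals and their signatures aside, so the next
# duplicity run sees only the old full and chains against it again.
mv "$BACKEND"/duplicity-inc.* "$ARCHIVE"/
mv "$BACKEND"/duplicity-new-signatures.* "$ARCHIVE"/

ls "$BACKEND"   # only the full chain remains visible to duplicity
ls "$ARCHIVE"   # archived incrementals; move them back before restoring from them
```

To restore a point in time covered by the archived incrementals, move the files back into place first and then run the restore as usual. Again: this is unsupported, so keep the archive dir out of duplicity's reach and test a restore before relying on it.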
..ede/duply.net