
Re: [Duplicity-talk] Fwd: AssertionError on every attempt


From: Rupert Levene
Subject: Re: [Duplicity-talk] Fwd: AssertionError on every attempt
Date: Tue, 9 Jun 2015 09:46:54 +0100

I also get duplicate files (same filename, same parent folder) in
Google Drive when using duplicity and have to remove them manually
before retrying. On large backups (thousands of volumes) this happens
quite often.

I think what is happening is that during a backup, duplicity checks
the size of a file on Google Drive after each upload, and sometimes
this call returns -1. (On later inspection, the original upload often
appears to have completed successfully, so I think this is some sort
of lag issue in Google Drive.) Duplicity concludes the upload didn't
work, so it re-uploads the file. On Google Drive, uploading a file
with the same name and parent folder doesn't overwrite the original
file; it creates a duplicate.

Maybe this could be fixed by deleting the original upload from the
server (since duplicity believes it to be faulty) before re-uploading?
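
A minimal sketch of that delete-before-retry idea. Everything here is
hypothetical illustration, not duplicity's or PyDrive's actual API: the
in-memory FakeDriveBackend stands in for Google Drive (including the
allow-duplicate-names behaviour and the laggy size query), and
put_with_retry is a made-up helper showing the proposed control flow:

```python
class FakeDriveBackend:
    """Stand-in for a Drive-like backend: stores (name, data) pairs and,
    like Google Drive, happily keeps several files with the same name."""

    def __init__(self):
        self.files = []       # list of (name, bytes) tuples
        self.lagged = set()   # names whose next size query will lag

    def upload(self, name, data):
        self.files.append((name, data))

    def size(self, name):
        # Simulate the observed lag: the first query after upload
        # returns -1 even though the upload actually succeeded.
        if name in self.lagged:
            self.lagged.discard(name)
            return -1
        sizes = [len(d) for n, d in self.files if n == name]
        return sizes[-1] if sizes else -1

    def delete_all(self, name):
        self.files = [(n, d) for n, d in self.files if n != name]


def put_with_retry(backend, name, data, retries=2):
    """Upload and verify; on a failed size check, delete every remote
    copy of the name before retrying, so a successful-but-laggy first
    upload cannot linger as a duplicate."""
    for _ in range(retries):
        backend.upload(name, data)
        if backend.size(name) == len(data):
            return True
        backend.delete_all(name)
    return False


backend = FakeDriveBackend()
backend.lagged.add("duplicity-full.vol27.difftar.gpg")
put_with_retry(backend, "duplicity-full.vol27.difftar.gpg", b"payload")
names = [n for n, _ in backend.files]
print(names.count("duplicity-full.vol27.difftar.gpg"))  # 1 copy, not 2
```

Without the delete_all() call, the same run would leave two copies of
the volume on the backend, which is exactly the failure mode described
above.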

Rupert

On 8 June 2015 at 21:05, Bruce Merry <address@hidden> wrote:
> On 8 June 2015 at 18:43,  <address@hidden> wrote:
>> On 08.06.2015 18:31, Bruce Merry wrote:
>>> Hi
>>>
>>> (apologies if this shows up twice - my first attempt was without
>>> subscribing to the list, and since it didn't show up in the list
>>> archives I'm assuming it went to the bit-bucket)
>>>
>>> I've somehow managed to get my Google Drive into a state where every
>>> time I run duplicity I get the error below. This occurred after I
>>> started a full backup, which crashed on volume 27; I restarted it,
>>> and it later crashed again (I don't have the error messages from
>>> those backups saved, sorry). I'm using the PyDrive backend, specifically
>>> https://code.launchpad.net/~bmerry/duplicity/pydrive-regular. I've
>>> tried
>>> - running duplicity cleanup (hits the same assertion)
>>> - wiping out the .cache/duplicity directory to force resync (resyncs,
>>> then hits the same assertion)
>>>
>>> I've since noticed that I have two copies of
>>> duplicity-full.20150607T094026Z.vol27.difftar.gpg in my Google Drive,
>>> which probably explains why I get the assertion error, but I'm not
>>> sure how to safely recover from this without ending up with a
>>> corrupted backup. Is it safe to just delete one of them?
>
> <snip>
>
>> Are you sure that you have two identically named files on the backend? Can
>> you send a file listing of some kind?
>
> Yes, I've double-checked it in the GUI. I also used a quick PyDrive
> script to get a list of all files in the drive (attached); it doesn't
> distinguish which directory each file is in (Google Drive seems only
> loosely tied to the idea of a filesystem), but it too shows the
> duplicate.
>
> I've also discovered
> https://bugs.launchpad.net/duplicity/+bug/1462862, which I think is
> most likely the same thing I'm seeing. My (partial) manifest also has
> two Volume 27: lines, with the same starting and ending path but
> different SHA1 sums.
>
> Is it safe to manually delete all the files on the remote and in my
> cache directory corresponding to this backup set (*20150607T094026Z*)
> and just start the full backup from scratch?
>
> Thanks
> Bruce
> --
> Dr Bruce Merry
> bmerry <@> gmail <.> com
> http://www.brucemerry.org.za/
> http://blog.brucemerry.org.za/
>
> _______________________________________________
> Duplicity-talk mailing list
> address@hidden
> https://lists.nongnu.org/mailman/listinfo/duplicity-talk
>
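
As an aside, once you have a flat listing like the one Bruce describes,
duplicate names can be flagged by counting titles. A short sketch: the
duplicate_titles helper and the sample entries below are made up for
illustration (a real listing would come from PyDrive, with
authentication already set up):

```python
from collections import Counter


def duplicate_titles(entries):
    """Return the titles that occur more than once in a flat listing."""
    counts = Counter(e["title"] for e in entries)
    return sorted(t for t, n in counts.items() if n > 1)


# Hypothetical listing: vol27 appears twice, as in the reported manifest.
listing = [
    {"title": "duplicity-full.20150607T094026Z.vol26.difftar.gpg"},
    {"title": "duplicity-full.20150607T094026Z.vol27.difftar.gpg"},
    {"title": "duplicity-full.20150607T094026Z.vol27.difftar.gpg"},
]
print(duplicate_titles(listing))
```

This prints the single duplicated volume name, which you could then
delete by hand (or script) before resyncing the cache.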


