Hi,
I see. Fascinating story. But then it is probably too late to fix that issue
now (old wget, old servers, old antivirus on an old router - they are all possible
sources of errors).
Also, as was pointed out here by others (was it Tim?) as well as by me: the continuation
mechanism by its very nature will always be vulnerable to a number of possible conditions
- some of them errors, others just "life happening" (like files being changed between
the downloads). So if and when you use the -c option, be aware that what you get may not
be what you want/need and may in fact be broken. So check your results.
Your examples are all 7z archives. Most archive files are easily checked, since
they give you an error when you unpack them. I am not so sure about ISO files,
but the ones I have dealt with so far had MD5 checksums available, and I highly
recommend using them.
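Just to illustrate, here is a minimal Python sketch of such a check; the file name
and the expected checksum are placeholders, so use whatever the download site actually
publishes:

    import hashlib

    def md5_of(path, chunk_size=1 << 20):
        # Compute the MD5 of a file, reading it in chunks to keep memory use low.
        h = hashlib.md5()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(chunk_size), b""):
                h.update(chunk)
        return h.hexdigest()

    # Placeholder values - compare against the checksum published next to the ISO.
    expected = "d41d8cd98f00b204e9800998ecf8427e"
    actual = md5_of("downloaded.iso")
    print("OK" if actual == expected else "MISMATCH: got " + actual)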
All in all, my feeling is that there is nothing here that needs fixing, and
probably nothing much that can be fixed. And the situation has probably improved
a lot since then, with most servers now handling the protocol correctly and
disruptions less likely thanks to better (and faster) networks. Plus you no
longer share your computer with others (I hope).
As I suggested in my previous post, there is one possible way to check for such
error situations. It is not 100% certain, but it may be very helpful in many cases.
I call it overlapping stitching. It would have to be incorporated into wget
(or rather wget2, since it would be a new feature).
Start your resumed download a few bytes before the end of what you already have
and compare the overlapping bytes. If they don't match, you are obviously not
downloading from the same file. Possible actions are to break off with an error
message, or to start the download from the beginning, either abandoning or saving
the old fragment.
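To make the idea concrete, here is a rough Python sketch of what I have in mind -
not wget code, just an illustration, and the URL, file name, and overlap size are
all made up:

    import os
    import urllib.request

    def resume_with_overlap(url, path, overlap=4096):
        # Re-fetch the last `overlap` bytes we already have and compare them
        # with what the server sends before appending anything new.
        have = os.path.getsize(path)
        start = max(have - overlap, 0)
        with open(path, "rb") as f:
            f.seek(start)
            tail = f.read()  # the bytes we expect to see again

        req = urllib.request.Request(url, headers={"Range": f"bytes={start}-"})
        with urllib.request.urlopen(req) as resp:
            if resp.status != 206:  # server ignored the Range header
                raise RuntimeError("server does not support resumed downloads")
            again = b""
            while len(again) < len(tail):  # read exactly the overlapping part
                piece = resp.read(len(tail) - len(again))
                if not piece:
                    raise RuntimeError("connection closed inside the overlap")
                again += piece
            if again != tail:
                raise RuntimeError("overlap mismatch - not the same file, refusing to resume")
            with open(path, "ab") as out:  # overlap matches: append the rest
                while True:
                    chunk = resp.read(1 << 20)
                    if not chunk:
                        break
                    out.write(chunk)

    # resume_with_overlap("https://example.org/big.iso", "big.iso")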
Also remember that some file types will easily reveal if they are broken (most
media files and compressed archives, for instance). Others may be editable or
salvageable in another way (typically text files or some types of office
documents). Others, such as executables, could be rather disastrous when used
without checking (think of a DLL with some critical function that runs into
execution exceptions; that could bring down your system for good).
In closing, let me remind us older ones of the days of the modem or acoustic coupler,
when after hours of downloading you had 60% of the file, the next attempt started from
byte 0 again and made it to the 20% mark, and the third one did not even get to 10% ...
For all those files out there that were static, it was a huge achievement to have a tool
that would - like a honey badger - not let go until it had the whole thing. That is
why I started using wget, and to this day I have never stopped. And with a bit of "batch
scripting" you can also save yourself a lot of work ...