Re: [Bug-wget] avoiding a large number of HEAD reqs when resuming


From: UukGoblin
Subject: Re: [Bug-wget] avoiding a large number of HEAD reqs when resuming
Date: Fri, 1 May 2015 11:54:42 +0000
User-agent: Mutt/1.4.2.3i

On Fri, May 01, 2015 at 12:11:34AM +0200, Giuseppe Scrivano wrote:
> have you had a look at --wait, --waitretry and --random-wait?
> 
> Maybe this is enough for circumventing your firewall, even though it
> will slow down the download process.

Yes, I'm aware of these. They can indeed help circumvent the firewall,
but they won't help with the original problem(s) of optimizing wget's
behaviour on resume, or with distributing the load across many IPs.

In order to figure out the correct delays, I usually have to run wget
a few times, get blocked after each try, then try longer --wait values
and see if that helps, etc. I'd prefer each such run to retrieve valuable
data rather than re-request HEADs of the files I already have.
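
To illustrate, each attempt is a run roughly along these lines (the URL
and the delay values here are just placeholders, not my actual job):

    # resume a recursive fetch, throttling requests between retrievals
    wget -c -r --wait=5 --random-wait --waitretry=60 \
        http://example.com/some/large/tree/

The only knob that changes between runs is essentially --wait, while
every restart still spends its first requests on files already on disk.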


