bug-wget

Re: [Bug-wget] Support for long-haul high-bandwidth links


From: Paul Wratt
Subject: Re: [Bug-wget] Support for long-haul high-bandwidth links
Date: Thu, 1 Dec 2011 06:25:13 +1300

Another command-line option, possibly?

Paul

2011/11/30 Andrew Daviel <address@hidden>:
> On Sat, 26 Nov 2011, Ángel González wrote:
>
>> On 10/11/11 03:24, Andrew Daviel wrote:
>>>
>>>
>>> When downloading a large file over a high-latency (e.g. long physical
>>> distance) high-bandwidth link, the download time is dominated by the
>>> round-trip time for TCP handshakes.
>>>
>>> Using the "range" header in HTTP/1.1, it is possible to start multiple
>>> simultaneous requests for different portions of a file using a
>>> standard Apache server, and achieve a significant speedup.
>>>
>>> I wondered if this was of interest as an enhancement for wget; a sketch
>>> of the idea follows below, after the quoted thread.
>>
>>
>> I think setting a large SO_RCVBUF should also fix your issue, by allowing
>> large TCP window sizes, and it's cleaner. OTOH, you need support from the
>> TCP stack, and it won't get you past per-connection rate limits that may
>> be throttling you in the single-connection case. (A sketch of this
>> approach also follows the quoted thread.)
>
>
> Yes, jumbo frames work well over a private link like a lightpath. I'd been
> thinking of something that would work on the unimproved public internet.
>
> I had been thinking of speeding up transfers to e.g. a WebDAV repository on
> another continent, but I recently became aware of "download accelerators"
> designed primarily to thwart bandwidth allocation/throttling. Interestingly,
> Wget is listed on the Wikipedia page as a "download manager", implying it
> can already do this.
>
> http://en.wikipedia.org/wiki/Download_acceleration
>
>
> --
> Andrew Daviel, TRIUMF, Canada
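
For illustration, here is a minimal sketch of the parallel-range idea in
Python (standard library only, rather than wget's C, for brevity). The URL,
output filename, and part count are placeholders; it assumes a server that
honors HTTP/1.1 Range requests and reports Content-Length. A real tool would
also handle servers that answer 200 instead of 206, retries, and resuming.

  #!/usr/bin/env python3
  # Illustrative sketch: download one file over several parallel
  # HTTP/1.1 Range requests and stitch the parts together.
  import urllib.request
  from concurrent.futures import ThreadPoolExecutor

  URL = "http://example.org/big.iso"   # hypothetical URL
  PARTS = 4                            # number of parallel connections

  def fetch_range(start, end):
      req = urllib.request.Request(URL,
                                   headers={"Range": f"bytes={start}-{end}"})
      with urllib.request.urlopen(req) as resp:
          assert resp.status == 206    # 206 Partial Content expected
          return start, resp.read()

  # Probe the total size with a HEAD request.
  head = urllib.request.Request(URL, method="HEAD")
  size = int(urllib.request.urlopen(head).headers["Content-Length"])

  # Split [0, size) into PARTS contiguous byte ranges.
  bounds = [size * i // PARTS for i in range(PARTS + 1)]
  ranges = [(bounds[i], bounds[i + 1] - 1) for i in range(PARTS)]

  with open("big.iso", "wb") as out, ThreadPoolExecutor(PARTS) as pool:
      for start, data in pool.map(lambda r: fetch_range(*r), ranges):
          out.seek(start)              # write each part at its own offset
          out.write(data)

Because each part rides its own TCP connection, each connection grows its
own congestion window, which is also why this trick gets past
per-connection throttles where a single large window would not.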
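
And a sketch of the SO_RCVBUF alternative Ángel suggests: request a large
receive buffer before connecting, so the kernel can advertise a large TCP
window and a single stream can fill a long fat pipe. The host, port, and
buffer size here are illustrative, and the kernel may clamp the request
(see net.core.rmem_max on Linux), so the value is read back to check.

  #!/usr/bin/env python3
  # Illustrative sketch: enlarge the socket receive buffer before
  # connecting, then fetch over a single TCP connection.
  import socket

  HOST, PORT = "example.org", 80       # hypothetical server
  WANT = 8 * 1024 * 1024               # ask for an 8 MiB receive buffer

  s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
  # Must be set before connect() so it can influence window scaling.
  s.setsockopt(socket.SOL_SOCKET, socket.SO_RCVBUF, WANT)

  # The kernel may grant less than requested; Linux also doubles the
  # stored value for bookkeeping, so the readback is approximate.
  got = s.getsockopt(socket.SOL_SOCKET, socket.SO_RCVBUF)
  print(f"requested {WANT} bytes, kernel granted {got}")

  s.connect((HOST, PORT))
  s.sendall(b"GET / HTTP/1.1\r\nHost: example.org\r\n"
            b"Connection: close\r\n\r\n")

  # Drain the response; with a large enough window, one connection can
  # keep a high bandwidth-delay-product path full on its own.
  while s.recv(65536):
      pass
  s.close()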


