From: Andrew Daviel
Subject: [Bug-wget] Support for long-haul high-bandwidth links
Date: Wed, 9 Nov 2011 18:24:38 -0800 (PST)
When downloading a large file over a high-latency (e.g. long physical distance) high-bandwidth link, the download time is dominated by TCP round-trip delays: a single connection's throughput is limited by the window size divided by the round-trip time.
In the past, tools such as bbftp have mitigated this effect by using multiple streams, but they required both a special server and a special client.
Using the "Range" header in HTTP/1.1, it is possible to start multiple simultaneous requests for different portions of a file against a standard Apache server and achieve a significant speedup. I have a proof-of-principle Perl script using threads that was able to download a medium-sized file from Europe to Vancouver in half the normal time.
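The technique can be sketched in a few lines. The following is a minimal illustration (in Python rather than the Perl of the original script), assuming a hypothetical URL whose server honours Range requests and reports Content-Length: the file is split into byte ranges, each fetched on its own thread with a "Range: bytes=start-end" header, then reassembled in order.

```python
import threading
import urllib.request

def split_ranges(length, parts):
    """Divide [0, length) into `parts` contiguous (start, end) byte ranges,
    with `end` inclusive, as the HTTP Range header expects."""
    chunk = length // parts
    ranges = []
    start = 0
    for i in range(parts):
        # The last range absorbs any remainder bytes.
        end = length - 1 if i == parts - 1 else start + chunk - 1
        ranges.append((start, end))
        start = end + 1
    return ranges

def fetch_range(url, start, end, out, index):
    # A conforming server answers 206 Partial Content with just these bytes.
    req = urllib.request.Request(url, headers={"Range": f"bytes={start}-{end}"})
    with urllib.request.urlopen(req) as resp:
        out[index] = resp.read()

def parallel_download(url, parts=4):
    # Learn the file size first via a HEAD request.
    head = urllib.request.Request(url, method="HEAD")
    with urllib.request.urlopen(head) as resp:
        length = int(resp.headers["Content-Length"])
    chunks = [None] * parts
    threads = [
        threading.Thread(target=fetch_range, args=(url, s, e, chunks, i))
        for i, (s, e) in enumerate(split_ranges(length, parts))
    ]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return b"".join(chunks)
```

A real implementation would also need to handle servers that ignore Range (status 200 instead of 206), retries, and writing chunks to disk rather than holding them in memory.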
I wondered if this would be of interest as an enhancement for wget. Regards -- Andrew Daviel, TRIUMF, Canada