
Re: [Bug-wget] Multi segment download

From: Yousong Zhou
Subject: Re: [Bug-wget] Multi segment download
Date: Wed, 9 Sep 2015 18:20:08 +0800

On 9 September 2015 at 11:20, Hubert Tarasiuk <address@hidden> wrote:
> On Sat, Aug 29, 2015 at 12:50 AM, Darshit Shah <address@hidden> wrote:
>> Thanking You,
>> Darshit Shah
>> Sent from mobile device. Please excuse my brevity
>> On 29-Aug-2015 1:13 pm, "Tim Rühsen" <address@hidden> wrote:
>>> Hi,
>>> normally it makes much more sense when you have several download mirrors
>>> and checksums for each chunk. The technique designed for exactly this is
>>> called 'Metalink' (more at www.metalinker.org).
>>> Wget has it in branch 'master'. A GSOC project of Hubert Tarasiuk.
>> Sometimes the evil ISPs enforce a per connection bandwidth limit. In such a
>> case, multi segment downloads from a single server do make sense.
>> Since metalink already has support for downloading a file over multiple
>> connections, it should not be too difficult to reuse the code for use
>> outside of metalink.
> The current Metalink implementation in Wget will not download from
> multiple mirrors simultaneously, since Wget itself is single-threaded.
> Adding optional (POSIX) threads support to Wget (especially for
> Metalink) could perhaps be worth discussing.
> For now the solution might be to start multiple Wget instances using
> the --start-pos option and somehow limit the length of download (I am
> not sure if Wget currently has an option to do that).

As mentioned in the discussion when we were about to introduce the
--start-pos option, the length of a download can be limited with other
utilities such as dd.  This was a deliberate choice to keep Wget's own
code simple.
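For instance, a ranged download can be cut off after a fixed number of bytes by piping wget's output through dd.  A sketch (the URL and offsets are placeholders, and iflag=fullblock assumes GNU dd):

```shell
# Fetch one 1 MiB segment starting at a 2 MiB offset (hypothetical URL):
#
#   wget -q -O - --start-pos=$((2*1024*1024)) http://example.com/big.iso \
#       | dd of=seg.part bs=$((1024*1024)) count=1 iflag=fullblock 2>/dev/null
#
# dd truncates any byte stream the same way; shown here on local input:
printf 'abcdefghij' | dd bs=4 count=1 2>/dev/null
```

The local pipeline prints `abcd`: dd copies exactly one 4-byte block and discards the rest of the stream.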

Well, I have just put together a proof-of-concept shell script that
starts multiple wget processes to download an HTTP file in parallel [1].

[1] Concurrent WGET with --start-pos option.
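The idea behind such a script can be sketched as below.  This is my own minimal reconstruction, not the referenced script itself; it assumes GNU dd (for iflag=fullblock), a server that honors Range requests, and at most 10 segments so that `part.*` sorts in order:

```shell
#!/bin/sh
# Ceiling division: smallest chunk size that covers the whole file.
ceil_div() { echo $(( ($1 + $2 - 1) / $2 )); }

url=$1          # download URL (required)
nseg=${2:-4}    # number of parallel segments (keep <= 10 for part.* sorting)

if [ -n "$url" ]; then
    # Total size, taken from the Content-Length header of a spider request.
    size=$(wget --spider --server-response "$url" 2>&1 \
           | sed -n 's/.*Content-Length: *\([0-9][0-9]*\).*/\1/p' | tail -n 1)
    chunk=$(ceil_div "$size" "$nseg")

    i=0
    while [ "$i" -lt "$nseg" ]; do
        # Each wget starts at its own offset; dd cuts the stream at chunk bytes.
        wget -q -O - --start-pos=$((i * chunk)) "$url" \
            | dd of="part.$i" bs="$chunk" count=1 iflag=fullblock 2>/dev/null &
        i=$((i + 1))
    done
    wait                     # let all background downloads finish
    cat part.* > output      # reassemble the segments in order
fi
```

Each background wget opens its own connection, which is what sidesteps a per-connection bandwidth cap; the trade-off is that a failed segment must be detected and retried by the caller, which plain wget would handle for a single-stream download.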


