[Bug-wget] wget in a 'dynamic' pipe
From: Paul Wagner
Subject: [Bug-wget] wget in a 'dynamic' pipe
Date: Thu, 19 Jul 2018 17:24:06 +0200
User-agent: Posteo Webmail
Dear wgetters,
apologies if this has been asked before.
I'm using wget to download DASH media files, i.e. a number of URLs of
the form domain.com/path/segment_1.mp4, domain.com/path/segment_2.mp4,
..., which represent chunks of audio or video and which are to be
combined to form the whole programme. I used to call an individual
instance of wget for each chunk and combine them, which was dead slow.
Now I tried
  { i=1; while [[ $i != 100 ]]; do echo "http://domain.com/path/segment_$((i++)).mp4"; done } | wget -O foo.mp4 -i -
which works like a charm *as long as the 'generator process' is finite*,
i.e. the loop is actually bounded as in the example. The problem is
that it would be much easier if I could let the loop run forever, let
wget fetch whatever is there, and have it fail once the counter reaches
a segment number that is no longer available, which would in turn fail
the whole pipe. It turns out that
  { i=1; while true; do echo "http://domain.com/path/segment_$((i++)).mp4"; done } | wget -O foo.mp4 -i -
hangs, in the sense that the first process loops forever while wget
doesn't even start retrieving. Am I right in assuming that wget waits
until the input specified by -i is fully written, i.e. until it sees
EOF? Is there any way to change this behaviour?
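For now I am falling back to one wget call per segment, stopping at the
first failed fetch, roughly like this (domain.com/path is of course just
the placeholder host from the examples above; it trades the single-input
run for one connection per segment, which is what made this slow for me
in the first place):

```shell
#!/bin/sh
# Stop-on-first-failure sketch: fetch segments one at a time, appending
# each to foo.mp4, until wget returns a non-zero exit status (e.g. 404).
# --tries=1 and --timeout bound how long a dead segment can stall us.
i=1
while wget -q --tries=1 --timeout=10 -O - \
      "http://domain.com/path/segment_${i}.mp4" >> foo.mp4; do
  i=$((i+1))
done
echo "stopped after segment $((i-1))"
```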
Any help appreciated. (I'm using wget 1.19.1 under cygwin.)
Kind regards,
Paul