Re: [Bug-wget] How to download all the links on a webpage which are in some directory?


From: Giuseppe Scrivano
Subject: Re: [Bug-wget] How to download all the links on a webpage which are in some directory?
Date: Mon, 01 Aug 2011 15:36:04 +0200
User-agent: Gnus/5.13 (Gnus v5.13) Emacs/24.0.50 (gnu/linux)

Peng Yu <address@hidden> writes:

> Suppose I want to download www.xxx.org/somefile/aaa.sfx and the links
> therein (but restricted to the directory www.xxx.org/somefile/aaa/).
>
> I tried the option '--mirror -I /somefile/aaa', but it only downloads
> www.xxx.org/somefile/aaa.sfx. What is the correct option to do this?

It looks like the right command.  Can you check with "-d" what is going
wrong?
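
For reference, a minimal sketch of such an invocation with debug output
enabled (www.xxx.org/somefile/aaa.sfx is the example URL from the original
question, and wget.log is just an arbitrary log file name):

  # -I limits recursion to /somefile/aaa, -d prints debug output,
  # -o writes all output to wget.log for later inspection
  wget --mirror -I /somefile/aaa -d -o wget.log http://www.xxx.org/somefile/aaa.sfx

The debug output shows, among other things, which links wget finds and
whether it accepts or rejects them, which should reveal why nothing under
/somefile/aaa/ is being fetched.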

Cheers,
Giuseppe


