[Bug-wget] Future of wget
From: kost BebiX
Subject: [Bug-wget] Future of wget
Date: Tue, 20 Apr 2010 13:34:27 +0400
Hello. I ran into a problem using wget to recursively download my site (I
needed something like a "spider" to fetch all of its pages). I had many URLs,
URL directories, and URL rules that I wanted to exclude from the download, and
expressing all of that turned out to be too complex for wget.
At the same time, there is currently nothing better than wget for fetching a
web page and extracting the links to follow. So I think it would be best for
wget to become more like a library that calls your own trigger functions in
Python. What do you think about that?
What I mean is: I want to write my own complex rules for which URLs to
download, do my own progress reporting (for my site, for example, I could print
how much is left to download), and probably other things as well. But I want
wget itself to download the pages I tell it to and hand me back the links it
found, along with information about them.
What do you think about this idea? Thank you.
--
jabber: address@hidden