From: Miquel Llobet
Subject: [Bug-wget] [bug #30999] wget should respect robots.txt directive crawl-delay
Date: Thu, 09 Apr 2015 15:27:19 +0000
User-agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_3) AppleWebKit/600.5.17 (KHTML, like Gecko) Version/8.0.5 Safari/600.5.17
Follow-up Comment #5, bug #30999 (project wget):

I have read the robots.txt specification thoroughly and found no way to set a crawl-delay for a specific file; if someone could look into that, it would be appreciated. Otherwise, I think the best solution is to set --wait to the matching Crawl-delay value when the user has not already set --wait.

_______________________________________________________

Reply to this item at: <http://savannah.gnu.org/bugs/?30999>

_______________________________________________
Message sent via/by Savannah http://savannah.gnu.org/
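The proposed fallback could be sketched roughly as below. This is a hypothetical illustration in Python, not wget's actual C code; the function names and the simplified robots.txt parsing (substring user-agent matching, no per-path Crawl-delay, since the spec offers none) are assumptions of the sketch.

```python
from typing import Optional


def crawl_delay(robots_txt: str, user_agent: str = "*") -> Optional[float]:
    """Return the Crawl-delay (in seconds) that applies to user_agent,
    or None if the robots.txt sets no delay for it."""
    delay = None
    applies = False
    for line in robots_txt.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments and whitespace
        if ":" not in line:
            continue
        field, _, value = line.partition(":")
        field, value = field.strip().lower(), value.strip()
        if field == "user-agent":
            # "*" applies to everyone; otherwise use a simple substring match
            applies = value == "*" or value.lower() in user_agent.lower()
        elif field == "crawl-delay" and applies:
            try:
                delay = float(value)
            except ValueError:
                pass  # ignore malformed values
    return delay


def effective_wait(user_wait: Optional[float], robots_txt: str) -> float:
    """An explicitly given --wait takes precedence; otherwise fall back
    to the site's Crawl-delay, defaulting to no delay."""
    if user_wait is not None:
        return user_wait
    delay = crawl_delay(robots_txt)
    return delay if delay is not None else 0.0
```

For example, with a robots.txt containing `User-agent: *` and `Crawl-delay: 10`, `effective_wait(None, robots_txt)` would yield 10.0, while `effective_wait(2.0, robots_txt)` would keep the user's explicit 2.0.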