[Bug-wget] HTTP gzip compression

Subject: [Bug-wget] HTTP gzip compression
Date: Mon, 15 Dec 2014 11:55:47 +0100

   Is there any way that HTTP compression could be added to wget natively?


   I use wget to download lots of HTML pages.
   Downloading these pages uncompressed uses a lot of bandwidth for both
   me and the server.
   For example, a 2 GB uncompressed HTML download would take only ~400 MB
   when compressed with gzip.
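   For a rough, offline illustration of that kind of ratio (a sketch only;
   the exact savings depend on the content), one can gzip a repetitive
   HTML sample and compare byte counts:

```shell
# Generate a repetitive HTML sample (markup like this compresses very well),
# then compress a copy and compare sizes. File names are illustrative.
yes '<tr><td>row</td><td>data</td></tr>' | head -n 20000 > sample.html
gzip -c sample.html > sample.html.gz
echo "raw:     $(wc -c < sample.html) bytes"
echo "gzipped: $(wc -c < sample.html.gz) bytes"
```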

   I know that setting --header="accept-encoding: gzip" in wget gets the
   server to return gzipped HTML,
   but wget cannot parse this downloaded content for more links because
   wget has no internal decompressor.
   This means wget's functionality is greatly reduced when using this
   header.
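   To make that workaround concrete, here is a sketch; the URL is a
   placeholder, and the printf/gzip lines merely simulate a gzipped server
   response so the pipeline can be shown offline:

```shell
# Manual workaround: request gzip, then decompress before extracting links.
# Against a real server it would look something like (placeholder URL):
#   wget --header="Accept-Encoding: gzip" -O page.html.gz http://example.com/page.html
# Simulated response body so this runs offline:
printf '<a href="next.html">next</a>\n' | gzip > page.html.gz
gunzip -f page.html.gz
# Link extraction has to happen outside wget, e.g.:
grep -o 'href="[^"]*"' page.html   # -> href="next.html"
```

   The point being: every step after the download has to be scripted by
   hand, whereas native support would let wget recurse as usual.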

   HTTP compression would be a very useful feature that would make wget
   much friendlier to webmasters
   by saving a lot of bandwidth.

   I understand that wget is freely made by people donating their own
   time; I am not demanding anything,
   merely putting forth a suggestion that the developers might like. I
   hope I have come across as civil
   and polite.

   Thank you for your time.
