Re: Workshop to save M$ Windows users - help needed

From: Eduardo Ochs
Subject: Re: Workshop to save M$ Windows users - help needed
Date: Mon, 4 Oct 2021 00:06:52 -0300

On Sun, 3 Oct 2021 at 16:44, Tomas Hlavaty <> wrote:
> Emacs should be able to download the files even on Windows without
> installing wget, e.g. this should work:
> (eww "")
> There is a variable url-cache-directory suggesting that caching can
> happen transparently.  Maybe you could even supply your own
> url-cache-creation-function.  Or figure out what functions you could
> reuse to simulate wget and your caching strategy in pure Emacs more
> closely.
> I know you want to do it manually, but why bother beginners with that?

Hi Tomas,

I've tried to implement something like what you are proposing. It
seems that the right tool for that, at least for the simplest
cases, is `url-retrieve-synchronously'. It is documented here:

  (find-node "(url)Retrieving URLs")
  (find-node "(url)Retrieving URLs" "url-retrieve-synchronously")
  (find-node "(url)Dealing with HTTP documents")

Straight.el uses `url-retrieve-synchronously' here:

Try this:

  (find-ebuffer
   (url-retrieve-synchronously "" 'silent 'inhibit-cookies)
   "\n\n")

The "\n\n" will put the point after the response headers.
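For example, something like this sketch extracts just the body as a
string. The name `my-url-get-body' is mine, and it does no error
checking at all:

  ;; Fetch URL synchronously and return the response body as a string.
  ;; The buffer returned by `url-retrieve-synchronously' contains the
  ;; HTTP response headers, then a blank line, then the body.
  (defun my-url-get-body (url)
    "Return the body of URL, without the HTTP response headers."
    (with-current-buffer
        (url-retrieve-synchronously url 'silent 'inhibit-cookies)
      (goto-char (point-min))
      (search-forward "\n\n")           ; skip the response headers
      (prog1 (buffer-substring-no-properties (point) (point-max))
        (kill-buffer))))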

I thought that it would be trivial to write a "fake wget" for
eshell using `url-retrieve-synchronously'. The part of adding a
new builtin seems to be easy, see:

  (find-node "(eshell)Built-ins" "eshell/ls")
  (find-eshellfile "")
  (find-eshellgrep "grep --color=auto -nH --null -e eshell/ *.el")

but I couldn't get the error checking right, and at some point I
stopped trying. Then I saw this thread,

and it convinced me that using a real wget would be the right
thing to do.
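For reference, here is a sketch of what such a fake wget could look
like. Defining a function called `eshell/wget' is what makes `wget'
available as a builtin, but the error checking below is exactly the
part I couldn't get right, so treat it as a guess:

  ;; Eshell treats any function named `eshell/NAME' as a builtin NAME.
  ;; This version downloads URL and saves the response body to FILE;
  ;; the only "error checking" is the `unless' test below.
  (defun eshell/wget (url file)
    "Download URL and save its body to FILE, like a very dumb wget."
    (let ((buf (url-retrieve-synchronously url 'silent 'inhibit-cookies)))
      (unless buf (error "Download of %s failed" url))
      (with-current-buffer buf
        (goto-char (point-min))
        (search-forward "\n\n")         ; skip the HTTP headers
        (write-region (point) (point-max) file)
        (kill-buffer))))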

I have some code for playing with url-retrieve here, if you'd
like to try:

  (find-wgeta-elisp "")

    Eduardo Ochs
