
Re: Workshop to save M$ Windows users - help needed

From: Tomas Hlavaty
Subject: Re: Workshop to save M$ Windows users - help needed
Date: Mon, 04 Oct 2021 19:34:46 +0200

On Mon 04 Oct 2021 at 00:06, Eduardo Ochs <> wrote:
> On Sun, 3 Oct 2021 at 16:44, Tomas Hlavaty <> wrote:
>> Emacs should be able to download the files even on Windows without
>> installing wget, e.g. this should work:
>> (eww "")
>> There is a variable url-cache-directory suggesting that caching can
>> happen transparently.  Maybe you could even supply your own
>> url-cache-creation-function.  Or figure out what functions you could
>> reuse to simulate wget and your caching strategy in pure Emacs more
>> closely.
>> I know you want to do it manually, but why bother beginners with that?
> Hi Tomas,
> I've tried to implement something like what you are proposing. It
> seems that the right tool for that, at least for the simplest
> cases, is `url-retrieve-synchronously'. It is documented here:
>   (find-node "(url)Retrieving URLs")
>   (find-node "(url)Retrieving URLs" "url-retrieve-synchronously")
>   (find-node "(url)Dealing with HTTP documents")
> Straight.el uses `url-retrieve-synchronously' here:
> Try this:
>   (find-ebuffer
>    (url-retrieve-synchronously
> ""
>     'silent 'inhibit-cookies)
>    "\n\n")
> The "\n\n" will put the point after the response headers.
> I thought that it would be trivial to write a "fake wget" for
> eshell using `url-retrieve-synchronously'. The part of adding a
> new builtin seems to be easy, see:
>   (find-node "(eshell)Built-ins" "eshell/ls")
>   (find-eshellfile "")
>   (find-eshellgrep "grep --color=auto -nH --null -e eshell/ *.el")
> but I couldn't get the error checking right, and at some point I
> stopped trying. Then I saw this thread,
> and it convinced me that using a real wget would be the right
> thing to do.
> I have some code for playing with url-retrieve here, if you'd
> like to try:
>   (find-wgeta-elisp "")
>   Cheers,
>     Eduardo Ochs

I see, url-retrieve-synchronously is a beast.  You could write something
simpler and keep track of the status.  Something like:

;;; -*- lexical-binding: t -*-

(require 'cl-lib)
(require 'url)

(defun url-retrieve-synchronously2 (url filename)
  (let ((z nil)      ; last status plist seen by the callback
        (again t))
    (cl-flet ((save (status done)
                ;;(message "save %s %s" status done)
                (setq z status)
                (when done
                  ;; Called in the retrieval buffer: write out the whole
                  ;; response, headers included, unless there was an error.
                  (unless (plist-get status :error)
                    (let ((coding-system-for-write 'raw-text-unix))
                      (write-region (point-min) (point-max) filename)))
                  (setq again nil))))
      (url-retrieve url #'save '(t))
      (while again
        (sleep-for 1))
      ;;(message "done")
      (unless (plist-get z :error)
        z))))

(url-retrieve-synchronously2 "" "/tmp/e2")
(url-retrieve-synchronously2 "https://logand.com1" "/tmp/e2")

This works for me: it saves the whole HTTP response.  On success it
returns the status plist and writes the data to the file; on failure it
returns nil.
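If you wanted the eshell "fake wget" from the quoted message, a builtin
following the `eshell/ls' pattern could be as small as this (the name
and the fallback filename are only a sketch, not a worked-out design):

(defun eshell/wget (url &optional filename)
  ;; Delegates to `url-retrieve-synchronously2' above; eshell picks up
  ;; any function named eshell/NAME as the builtin NAME.
  (url-retrieve-synchronously2
   url (or filename (file-name-nondirectory url))))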

Anyway, I was trying to suggest something even simpler: use eww and
customize it so that it caches the downloaded files.  This is what you
are doing manually in the shell.
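A minimal sketch of that idea using url.el's built-in cache (the
variables are real; the directory value is just an illustration):

(require 'url-cache)
;; Cache every successful retrieval transparently; eww goes through
;; url-retrieve, so its downloads land in the cache too.
(setq url-automatic-caching t)
(setq url-cache-directory "~/.emacs.d/url/cache/")  ; adjust to taste
;; To see where a given URL would be stored:
;; (url-cache-create-filename "https://www.gnu.org/software/emacs/")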

Or even better, bind for example M-f to ffap so that people can simply
press M-f on a url and Emacs will take them there.
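Something like this (note that it shadows the default `forward-word'
binding, and making ffap open URLs in eww rather than an external
browser is an extra assumption on my part):

(require 'ffap)
(global-set-key (kbd "M-f") #'find-file-at-point)  ; `ffap' is an alias
;; Optional: keep URL visits inside Emacs.
(setq browse-url-browser-function #'eww-browse-url)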
