> From: Richard Copley <address@hidden>
> Date: Wed, 15 May 2019 07:46:12 +0100
> Cc: Noam Postavsky <address@hidden>, address@hidden
>> I don't see how disabling decoding could make sense, can you explain?
> Not in detail, it's not an area of expertise of mine. We call
> (decode-coding-region (point-min) (point-max) 'undecided) on the
> payload of "https://elpa.gnu.org/packages/archive-contents",
> which is raw text. The resulting buffer's
> buffer-file-coding-system is iso-latin-1-dos.
>> What does this code do on GNU/Linux?
> The same. The resulting coding system is iso-latin-1-unix.
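The quoted behaviour can be sketched in a scratch buffer; the payload string here is a made-up stand-in for the real archive-contents bytes, not the actual data:

```elisp
;; Sketch of what the quoted code does: insert raw CRLF bytes, let
;; Emacs detect a coding system with `undecided', then inspect the
;; detected EOL type.  (The string is a hypothetical payload.)
(with-temp-buffer
  (set-buffer-multibyte nil)
  (insert "(1\r\n (foo . \"bar\"))\r\n")
  (decode-coding-region (point-min) (point-max) 'undecided)
  ;; `last-coding-system-used' records what detection chose;
  ;; `coding-system-eol-type' returns 1 for DOS (CRLF) line endings.
  (coding-system-eol-type last-coding-system-used))
```

On a CRLF payload like this, detection picks a `-dos` variant, which is why the decoded buffer no longer contains the carriage returns that were in the bytes on the wire.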
That URL seems to bring ASCII text. Are you saying that GPG produces
a wrong signature because EOL format is significant for it? (Please
forgive silly questions about GPG: I seldom if ever use it.)
Getting the signature involves applying a hash function to the bytes
in question. It's desirable that two different byte sequences give rise
to two different signatures, even if the difference is a carriage return.
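The point about hashing can be illustrated directly: the same text with Unix versus DOS line endings is two different byte sequences, so it hashes to two different values (the literal strings here are illustrative, not the real archive contents):

```elisp
;; Illustration: identical text apart from a carriage return hashes
;; to different SHA-256 values, so EOL conversion changes the bytes
;; that signature verification operates on.
(secure-hash 'sha256 "archive-contents\n")
(secure-hash 'sha256 "archive-contents\r\n")
;; The two resulting hex strings differ.
```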
In any case, if we don't want EOL conversion, we should bind
inhibit-eol-conversion to a non-nil value, and change nothing else.
But this should not be done in url-insert-buffer-contents; it should
be done in package.el, because the former is a general utility and
does not necessarily need to inhibit EOL conversion for all of its callers.
Of course. I was confirming Noam's hunch, not suggesting a change.
Sorry that wasn't clear.
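For reference, the binding suggested above could look like the following. This is only a sketch at a hypothetical call site (the buffer and URL variable names are invented), not an actual patch to package.el:

```elisp
;; Sketch of the suggested fix: bind `inhibit-eol-conversion' around
;; the insertion so carriage returns in the payload survive decoding
;; and the signature is verified against the original bytes.
(let ((inhibit-eol-conversion t))
  ;; `response-buffer' and `url' are hypothetical names here.
  (url-insert-buffer-contents response-buffer url))
```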