
Re: map-file-lines

From: Stefan Monnier
Subject: Re: map-file-lines
Date: Mon, 02 Feb 2009 17:40:37 -0500
User-agent: Gnus/5.13 (Gnus v5.13) Emacs/23.0.60 (gnu/linux)

>>> Emacs Lisp lacks a good way to iterate over all the lines of a file,
>>> especially for a large file.

SM> I'm not really happy about focusing on "line at a time".  It's a useful
SM> and common case, but Emacs usually is pretty good about being "line
SM> agnostic" (font-lock being an obvious counterexample).

SM> Providing some kind of stream-processing functionality might be good,
SM> tho the need doesn't seem terribly high, since we've managed to avoid it
SM> until now.

> Without this function, Emacs simply can't handle large files, and that's
> been requested at least four times by users, as far as I can recall.
> I think a general solution to Emacs's large-file problem would be
> better, but line-oriented processing is a classic approach to handling
> large files that many Emacs users will probably find familiar.

I know about the large-file problem, obviously, but I wonder what kind
of UI you expect to provide, in order for it to be able to work just one
line at a time.

SM> FWIW, another option is to provide an open-file-stream along the same
SM> lines as open-network-stream.  I.e. the chunks are received via
SM> a process filter.

> How is that better than insert-file-contents as I use it?

It processes just one chunk at a time, with no need to keep the whole
file in memory.  A naive implementation could look like
(start-process "open-file-stream" nil "cat" file).
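To make that naive idea concrete, a filter-based sketch might look like
the following (the function name `my-map-file-lines-async' and the
LINE-FN callback are hypothetical; this also assumes lexical binding for
the closure over `partial'):

```elisp
;; -*- lexical-binding: t -*-
;; Sketch only: feed each chunk of FILE (delivered by "cat" through a
;; process filter) to LINE-FN, one complete line at a time, without
;; ever holding the whole file in memory.
(defun my-map-file-lines-async (file line-fn)
  "Call LINE-FN on each line of FILE, processing one chunk at a time."
  (let ((partial ""))
    (set-process-filter
     (start-process "open-file-stream" nil "cat" file)
     (lambda (_proc chunk)
       (let ((lines (split-string (concat partial chunk) "\n")))
         ;; The last element may be an incomplete line; keep it
         ;; around until the next chunk arrives.
         (setq partial (car (last lines)))
         (dolist (line (butlast lines))
           (funcall line-fn line)))))))
```

Since the filter runs asynchronously, LINE-FN is called as chunks
arrive rather than in one synchronous pass.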

> Are you thinking of a stateful back/forward seek capability?  Or do
> you mean you'd like it to be asynchronous?

Just that the file is received/processed one chunk at a time, so you
never need to hold the whole file in memory at any one time.
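For comparison, the same chunk-at-a-time property can be had
synchronously with the BEG/END arguments of `insert-file-contents'.
A sketch (the name `my-map-file-lines' is hypothetical; BEG/END are
byte offsets, so this assumes a single-byte coding system, and
multibyte files would need more care):

```elisp
;; Sketch only: read FILE in fixed-size chunks, calling LINE-FN on
;; each complete line, keeping at most one chunk plus a partial line
;; in the buffer at any time.
(defun my-map-file-lines (file line-fn &optional chunk-size)
  "Call LINE-FN on each line of FILE, reading CHUNK-SIZE bytes at a time."
  (setq chunk-size (or chunk-size 65536))
  (with-temp-buffer
    (let ((beg 0) done)
      (while (not done)
        (goto-char (point-max))
        ;; `insert-file-contents' returns (FILENAME BYTES-INSERTED).
        (let ((read (nth 1 (insert-file-contents
                            file nil beg (+ beg chunk-size)))))
          (setq done (< read chunk-size))
          (setq beg (+ beg read)))
        ;; Process and delete every complete line now in the buffer.
        (goto-char (point-min))
        (while (search-forward "\n" nil t)
          (funcall line-fn
                   (buffer-substring (point-min) (1- (point))))
          (delete-region (point-min) (point)))))
    ;; Trailing text with no final newline.
    (when (> (point-max) (point-min))
      (funcall line-fn (buffer-string)))))
```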
