emacs-devel

Re: warn-maybe-out-of-memory


From: Eli Zaretskii
Subject: Re: warn-maybe-out-of-memory
Date: Fri, 11 Jul 2014 12:02:51 +0300

> Date: Fri, 11 Jul 2014 12:43:39 +0400
> From: Dmitry Antipov <address@hidden>
> CC: address@hidden
> 
> On 07/11/2014 10:50 AM, Eli Zaretskii wrote:
> 
> > But if this warning is to be useful, it should catch the majority of
> > the cases, otherwise we are just wasting cycles.  With the current
> > test, many cases of visiting files that cannot be accommodated will go
> > undetected, thus rendering the whole feature useless.
> 
> With this feature, I'm just trying to redirect the user to `find-file-literally'
> if the file looks so large that `find-file-noselect' is likely to run out of
> memory.  What other cases do you mean?

I mean the cases where the file size is borderline, near the available
memory, but slightly less than that.

> > Since it's only a warning that doesn't prevent visiting a file, it is
> > better to err on the other side of the truth.  E.g., apply some
> > heuristics as to the average expansion factor, perhaps dependent on
> > the locale.
> 
> IIUC there are no reliable heuristics here.

A heuristic doesn't have to be reliable, just plausible.  Erring on the
"safe side", which in this case means emitting the warning even when
the file is actually not too large, is OK -- it's only a warning.

> For example, produce_charset and compose_text may allocate a
> substantial amount of memory to represent `composition' and
> `charset' extra properties

Then by all means add that memory to the estimation, at least in
locales that frequently do that.  Or maybe even always.
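
Purely as an illustration of what a locale-dependent factor could look
like (the predicate and the numbers below are invented, not a proposal
for the actual values):

  ;; Invented numbers: use a larger expansion factor in language
  ;; environments where compositions and charset properties are common.
  (defun my-estimated-expansion-factor ()
    (if (string-match-p "\\`\\(ja\\|zh\\|ko\\)" (or (getenv "LANG") ""))
        3.0    ; e.g. CJK locales: decoding plus extra text properties
      1.5))    ; mostly plain ASCII/Latin text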

> and we need to read and decode everything to find all places where
> such extra properties should be attached.

Granted, reading and decoding everything in advance is not the intent.

> > And don't forget that even for plain-ASCII files we
> > allocate the gap in addition to the text itself, so the mere size of
> > the file is simply _never_ the correct value, it is always an
> > underestimation.  IOW, this test is always biased towards lower
> > values.
> 
> Unless someone still uses a machine with 64K RAM, the initial gap size
> is very small in comparison with the file sizes we're talking about.

It is indeed very small, but it can easily cause the needed memory to
cross the border.  Adding it will catch these borderline cases, which
AFAIU is the purpose of this feature.
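
Sketched on top of the illustration above (the 2000-byte gap here is
only a placeholder; the real initial gap size is decided in the C code):

  ;; Same made-up check as before, but with the gap added, so that a
  ;; file whose estimated size lands just below the free memory is
  ;; still caught.
  (defun my-file-probably-fits-p (size)
    (let ((meminfo (memory-info))
          (gap 2000)         ; placeholder for the initial gap size
          (expansion 2.0))   ; same made-up factor as above
      (or (not (consp meminfo))
          (<= (+ (* size expansion) gap)
              (* 1024.0 (+ (nth 1 meminfo) (nth 3 meminfo)))))))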

> We don't need to do "something like that", i.e. we don't need a precise
> estimation.  OSes are tricky about memory management; the overall VM picture
> may drastically change while file reading is in progress, etc.  With so many
> constraints you can't predict, estimate, or control, a precise estimation
> just makes no sense.

Then how does this feature make sense?  It is, according to you,
unpredictable and uncontrollable.


