From: Paul Eggert
Subject: Re: default large-file-warning-threshold
Date: Sun, 30 Nov 2014 21:46:38 -0800
User-agent: Mozilla/5.0 (X11; Linux x86_64; rv:31.0) Gecko/20100101 Thunderbird/31.2.0
Óscar Fuentes wrote:
> If a feature needs to scan the whole buffer, and the scanning is not trivial, visiting an 800 MB file would be unpleasant.
If the scanning is sufficiently nontrivial, the same argument would apply to a 10 MB file, no? So perhaps large-file-warning-threshold should be mode- or feature-dependent, as well as dependent on physical RAM size.
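For illustration, a RAM-dependent threshold could look something like the sketch below in an init file. This is only a sketch: it assumes memory-info is available in the running Emacs (it reports sizes in KiB), and the 1/16 fraction is an arbitrary choice, not a proposal for the actual default.

;; Sketch: scale large-file-warning-threshold with physical RAM.
;; Assumes `memory-info' exists (it returns sizes in KiB); the 1/16
;; fraction is arbitrary and only for illustration.
(when (fboundp 'memory-info)
  (let ((total-ram-bytes (* 1024 (car (memory-info)))))
    (setq large-file-warning-threshold (/ total-ram-bytes 16))))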
I agree that Emacs is too slow when used in default mode on large text files. On my circa-2011 desktop, if I want reasonable efficiency on a multigigabyte text file I typically use find-file-literally, because plain find-file makes me wait for tooooo maaaannny seconds. There's no reason in principle that Emacs must be that slow on large files; it's just that performance on large files has not been that high a priority, and the tiny default large-file-warning-threshold is to some extent a symptom of this.
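In the meantime, one user-level workaround along these lines is a small wrapper command that falls back to find-file-literally above some size. In the sketch below, the command name my-find-file-maybe-literally and the 500 MB threshold are invented for the example.

;; Sketch of a user-level workaround; the command name and the
;; 500 MB threshold are made up for this example.
(defvar my-literal-threshold (* 500 1024 1024)
  "File size in bytes above which to fall back to `find-file-literally'.")

(defun my-find-file-maybe-literally (file)
  "Visit FILE, literally if it is larger than `my-literal-threshold'."
  (interactive "fFind file: ")
  (let ((size (nth 7 (file-attributes file))))
    (if (and size (> size my-literal-threshold))
        (find-file-literally file)
      (find-file file))))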
Here's a little shell script that lets me reproduce the performance problem, if you're interested:
yes '#define assume_initialized(val) asm ("" : "=X" (val))' | head -n 30000000 >t
time emacs -Q -nw --execute '(find-file "t")' --kill
time emacs -Q -nw --execute '(find-file-literally "t")' --kill

On my desktop the find-file version takes 8.06 user CPU seconds and the delay is really annoying, whereas the find-file-literally version takes only 0.04 user CPU seconds. It's like night and day.
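If it's more convenient to time this from inside a running Emacs, something like the following should show a similar gap (benchmark-run is in the bundled benchmark.el; the numbers will of course differ from the shell timings):

;; Time both code paths on the generated file "t".
;; benchmark-run returns (ELAPSED-SECONDS GC-COUNT GC-ELAPSED).
(require 'benchmark)
(setq large-file-warning-threshold nil) ; don't pause for the size prompt
(benchmark-run 1 (find-file "t"))
(kill-buffer)
(benchmark-run 1 (find-file-literally "t"))
(kill-buffer)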