Re: Regression in dump-emacs-portable

From: Lynn Winebarger
Subject: Re: Regression in dump-emacs-portable
Date: Thu, 16 Feb 2023 04:54:33 -0500

On Thu, Feb 16, 2023 at 4:31 AM Lynn Winebarger <owinebar@gmail.com> wrote:
> On Wed, Feb 15, 2023, 7:43 AM Eli Zaretskii <eliz@gnu.org> wrote:
>> > From: Lynn Winebarger <owinebar@gmail.com>
>> > Date: Tue, 14 Feb 2023 18:26:07 -0500
>> > Cc: emacs-devel@gnu.org
>> >
>> > On Tue, Feb 14, 2023 at 9:23 AM Eli Zaretskii <eliz@gnu.org> wrote:
>> > >
>> > > What do these tests actually test?
>> >
>> > Whether libraries expected to be redumpable are in fact redumpable.
>> That's the goal, not the actual testing algorithm.  I asked about the
>> latter.  How do you intend to test that a dump succeeded (assuming
>> there's no crash)?
> The dump will have to be performed in a separate Emacs process. The easiest 
> criterion to judge is whether the dump file exists and is greater than 0 
> bytes.  Emacs appears to create a 0-byte file when dump-emacs-portable is 
> invoked, which is simply not updated if the dump terminates unsuccessfully.
> A second criterion is to invoke Emacs with the dump file and evaluate 
> some simple expression to verify that no unexpected errors were encountered 
> on load.
> I've started automating my process with some simple shell scripting tracked 
> at https://github.com/owinebar/emacs-redumping.  It's not much yet, but at 
> least I was able to align my efforts between 28.2 and 30.0.50.  The next step 
> will be to create a proper load-time dependency graph, so I can automate the 
> calculation of the minimal list of features that need to be provided so that 
> the maximum number of libraries can be loaded for the dump, with the 
> artificially provided features loaded on an after-init hook (because 
> before-init happens prior to the X frame initialization).
> Once these dependencies are identified and lists are calculated, then 
> creating a set of canned tests should be straightforward.  Some 
> makefile-based approach should be adequate for determining which parts of the 
> dependency graph need to be recalculated after an update.
> I want to calculate these dependencies (and compile-time dependencies) to 
> construct a more robust native-compilation build process anyway.
> For a regression test, I would want to record the results from 28.2 as a 
> basis for measuring 29 and 30, at least as a starting point.  In any case, I 
> never see an "abort signal" termination in 28, or even a "weird pseudovector" 
> message.  It's either something incompatible (because I blindly attempted to 
> load the world), as with the "term" subdirectory or the dos/w32 libraries 
> under Linux, or some redefinition of a character table (which is why I 
> calculate the files loaded in the baseline dump and exclude them).  I got 
> some very lengthy error messages printing out explicit objects from some 
> obsolete libraries, so I exclude them as well.
> And viper demands user input at startup when it's loaded, so it has to be 
> excluded from dumping.  There might be some variable to turn off that 
> annoying behavior; I'm just not interested in investigating.
>> > Almost every library in 28.2 could be redumped, excepting those which
>> > simply failed to load for whatever reason.
>> Don't we have Lisp objects that cannot be dumped?  If we do, then not
>> every library could be dumped even in principle.
> In 28.2, using dump-emacs-portable, the answer is: not many of the libraries 
> included in the Emacs source distribution.  I excluded the term and 
> obsolete subdirectories when generating the set of libraries to dump (but 
> not from the final set determined from load-history).  Even outside of the 
> Emacs distribution, the only problematic objects are dynamic modules.  I 
> assume this is due to dumping in batch mode.  My exclusion of wid-edit.el is 
> because dumping it in batch mode appears to bar it from ever subsequently 
> creating proper buttons in a graphical terminal.  But dumping it still 
> succeeds.
>> Another potential issue with this is (assuming you suggest to actually
>> try dumping every library) that it will take too long, and thus will
>> be likely to be skipped in any "normal" run of the test suite, thus
>> missing the point.
> My 2017-vintage laptop dumps the 1252 files, including all of leim, in 34 
> seconds, for a 135MB dump file.
> When I added leim to the exclusions list, 1172 libraries are dumped in 24 
> seconds for an 83MB dump file, which explains why my effort with 30.0.50 
> produces a 75MB dump.  I excluded leim for 30.0.50 because I was encountering 
> too many errors to deal with manually, which explains most of the size 
> reduction.
> I'm not sure how the tests are normally run, but I would think anyone working 
> on pdumper should be interested in a comprehensive test at some point.  Aside 
> from testing on a per-commit basis, isn't there a more comprehensive set of 
> regression tests run pre-release?  Does Emacs have a CI process regularly 
> running the test suite, or is it more ad hoc?  If nothing else, failures 
> reported from such a routine run could be used to create a more targeted test 
> set for someone actively working on pdumper.
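The two dump-success criteria described earlier in the thread could be sketched as shell functions like the following; the function names and the `--dump-file` invocation are illustrative assumptions, not the actual scripts from the linked repository:

```shell
# Criterion 1: the dump file must exist and be larger than 0 bytes
# (Emacs creates a 0-byte file that is left behind if the dump fails).
check_dump() {
  test -s "$1"
}

# Criterion 2: restart Emacs with the dump file in a separate process and
# evaluate a simple expression to verify it loads without errors.
verify_dump() {
  "${EMACS:-emacs}" -Q --batch --dump-file="$1" --eval '(message "ok")'
}
```

A driver script would then run something like `check_dump redump.pdmp && verify_dump redump.pdmp` after the dumping process exits.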

Just to finish this thought - dumping the full set of libraries,
excluding a few expected to fail, should be the "normal" test.  If
24-34 seconds is too long, there are probably other large subsets that
provide substantial coverage in less time. The more comprehensive
file-by-file approach should be reserved for tracking down the cause
of failures in the normal test.  Theoretically, pdumper might be able
to indicate the source library (or libraries) associated with a particular
error, but the file-by-file approach is always available.
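As a rough illustration, the file-by-file fallback might look like a loop of single-library dump attempts; the library names and the dump expression here are hypothetical stand-ins:

```shell
# Sketch of the file-by-file fallback: attempt to dump each library on its
# own and report which ones fail.  The library list below and the dump
# expression are illustrative assumptions.
dump_one() {
  lib=$1
  out="dump-$lib.pdmp"
  "${EMACS:-emacs}" -Q --batch \
    --eval "(progn (require '$lib) (dump-emacs-portable \"$out\"))" \
    2>/dev/null && test -s "$out"
}

for lib in subr-x seq cl-lib; do   # hypothetical library list
  if dump_one "$lib"; then echo "PASS $lib"; else echo "FAIL $lib"; fi
done
```

The PASS/FAIL report would localize a failure seen in the comprehensive run to a specific library.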

