
Re: [aarch64] sbcl build failure

From: Leo Famulari
Subject: Re: [aarch64] sbcl build failure
Date: Wed, 31 Mar 2021 11:29:21 -0400

On Wed, Mar 31, 2021 at 01:26:20AM -0400, Leo Famulari wrote:
> > I'm looking at the following cl-zstd build failure:
> >
> > which is caused by an sbcl build failure that I
> > reproduced locally.

Unfortunately, the build failed on the Overdrive 1000, basically as
shown in the build log of job 133326:

//testing for consistency of first and second GENESIS passes
//header files match between first and second GENESIS -- good

real    38m6.607s
user    35m38.723s
sys     2m26.799s
//doing warm init - compilation phase
This is SBCL 2.1.3, an implementation of ANSI Common Lisp.
More information about SBCL is available at <>.

SBCL is free software, provided as is, with absolutely no warranty.
It is mostly in the public domain; some portions are provided under
BSD-style licenses.  See the CREDITS and COPYING files in the
distribution for more information.
Initial page table:
Gen  Boxed   Code    Raw  LgBox LgCode  LgRaw  Pin       Alloc     Waste        Trig      WP GCs Mem-age
 6     398    251      0      0      0      0    0    42470112     62752     2000000     649   0  0.0000
           Total bytes allocated    =      42470112
           Dynamic-space-size bytes =    3221225472
COLD-INIT... (Length(TLFs)= 9815)
Argh! corrupted error depth, halting
fatal error encountered in SBCL pid 579 tid 579:
%PRIMITIVE HALT called; the party is over.

Error opening /dev/tty: No such device or address
Welcome to LDB, a low-level debugger for the Lisp runtime environment.
real    0m0.186s
user    0m0.145s
sys     0m0.041s
command "sh" "" "clisp" "--dynamic-space-size=3072" "--with-sb-core-compression" "--with-sb-xref-for-internals" failed with status 1
builder for `/gnu/store/zij71vlv5a93qs2bmz29qwy1byz076iy-sbcl-2.1.3.drv' failed with exit code 1
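For what it's worth, the "Dynamic-space-size bytes" value in the banner is consistent with the `--dynamic-space-size=3072` argument (assuming, as SBCL's documentation states, that a bare number is interpreted as megabytes), so the size setting itself appears to be applied correctly:

```python
# Sanity check: --dynamic-space-size=3072 (MiB) should equal the
# byte count reported in the SBCL startup banner.
size_mib = 3072
size_bytes = size_mib * 1024 * 1024
print(size_bytes)  # 3221225472, matching "Dynamic-space-size bytes" in the log
```

So the crash during COLD-INIT doesn't look like a heap-sizing problem.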
