
Re: wip-rtl status

From: Ludovic Courtès
Subject: Re: wip-rtl status
Date: Mon, 04 Jun 2012 00:30:40 +0200
User-agent: Gnus/5.110018 (No Gnus v0.18) Emacs/24.0.93 (gnu/linux)

Hi Andy,

Wow, this is dense.  ;-)

Andy Wingo <address@hidden> skribis:

> I started to look at static constant allocation.  In order to do so I
> needed an object format that could link together different sections with
> different permissions and alignments, so I imported an old ELF branch I
> had hanging around, and adapted it to work with wip-rtl.  Now the
> interface is that you make an assembler, emit some programs, then link,
> resulting in a list of ELF objects.  You then link-elf to produce a
> bytevector, which can be written to disk, then loaded with
> load-thunk-from-disk from (system vm objcode).  You can also load via
> load-thunk-from-memory, which takes a bytevector.  In that case you
> might want to skip alignment and permissions and just operate on
> read-write memory, so there is the #:page-aligned? kwarg to link-elf for
> that situation.

Sounds cool!
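Just to check that I follow, the workflow you describe would be roughly
like the sketch below?  (The procedure names come straight from your
message, but the argument conventions and extra modules are my guesses,
not the actual wip-rtl API.)

```scheme
(use-modules (system vm objcode)     ; load-thunk-from-disk & friends
             (ice-9 binary-ports))   ; put-bytevector

;; 1. Make an assembler, emit some programs with the emit-foo
;;    procedures, then link, yielding a list of ELF objects.
;; 2. link-elf glues them into a single bytevector; when loading
;;    into plain read-write memory, alignment can be skipped:
(define bv (link-elf elf-objects #:page-aligned? #f))

;; 3a. Written to disk, it can be loaded back later:
(call-with-output-file "foo.go"
  (lambda (port) (put-bytevector port bv)))
(define thunk (load-thunk-from-disk "foo.go"))

;; 3b. Or loaded straight from the bytevector in memory:
(define thunk* (load-thunk-from-memory bv))
```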

Forget it if it’s too late, but would it make sense to first work on ELF
as the container format and merge that into ‘master’, then merge wip-rtl
on top?  It would be easier to digest.  ;-)

> Instructions that take constants take them literally.  The various
> emit-foo procedures will add the constant to a table if necessary, and
> link-objects will serialize a constant table if necessary.  Not sure
> whether this is being too clever.

I suppose the result is the same whether the constant table is built at
a level equivalent to GLIL or at some lower level.  So perhaps it is
more a question of which approach is implemented more easily or
elegantly?

> Currently there are a couple of broken bits.  One is that we don't
> write an init thunk, to fix up the car and cdr links in static
> pairs, for example.

Static pairs, as in constant pairs stored on disk?
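If so, I can see why an init thunk is needed: a serialized pair whose
car or cdr points to another object cannot know that object's final
address until load time, so something like the sketch below would have
to run once the sections are mapped.  (All the names here are
hypothetical, just to illustrate my understanding.)

```scheme
;; Hypothetical init thunk: patch cross-references among statically
;; allocated objects once their load addresses are known, before the
;; data is made read-only.
(define (init-static-constants!)
  ;; static-pair-42 was serialized with placeholder car/cdr fields;
  ;; point them at the other static constants.
  (set-car! static-pair-42 static-string-7)
  (set-cdr! static-pair-42 static-pair-43))
```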

Keep up the good hack, and let us know how it goes!  :-)

