
Re: stack closures for guile-log

From: Stefan Israelsson Tampe
Subject: Re: stack closures for guile-log
Date: Wed, 16 May 2012 22:12:19 +0200


Yes, I think that if one goes for capturing the stack by copying frames, then that's the way
to go: it makes rewinding and unwinding fast. On the other hand, guile-log currently
uses very small linked frames, so essentially all the stored information lives in the
heap, compressed into a tree. That is superior memory-wise, both because of the compression and because the GC gets better opportunities to reclaim memory.
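A minimal sketch of what such linked frames might look like (the struct and names here are assumptions, not guile-log's actual layout): each frame carries only its own payload plus a pointer to its parent, so frames that branch from a common point share their tail, and the captured state forms a tree in the heap rather than a flat copied stack.

```c
#include <stdlib.h>

/* Hypothetical sketch -- not guile-log's real representation.  A frame
   stores only its own slot and a parent link; two captures that share
   an ancestor share that ancestor's storage, which is the source of
   the memory "compression" and gives the GC per-frame granularity. */
typedef struct frame {
    struct frame *parent;   /* shared tail of the linked frames */
    void *slot;             /* per-frame payload */
} frame;

static frame *push_frame(frame *parent, void *slot) {
    frame *f = malloc(sizeof *f);
    f->parent = parent;
    f->slot = slot;
    return f;
}
```

Unwinding is then just following `parent` links, and any frame the heap no longer references can be reclaimed independently of its siblings.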

However, I need to run more benchmarks in order to understand this better.

I also need to add hooks to handle a stack with stored conses. These conses serve as dereference and identity objects for the closures and will be referenced throughout the heap. I cannot move
the conses, but I can release the references to them and allocate new conses, in essence
moving a cons from the stack to the heap. This is quite neat, but it does reduce the possibilities for reclaiming memory in some cases, so conses on the stack will be an expert option. It is needed, though, to get the last 2x-3x speedup down to what compiled Prolog can do.

The bulk of the closure stack is quite simple in its functioning, and there I think it is
a bit of overkill to use a dynstack.
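For comparison, a plain bump allocator with mark/release is about all such a bulk stack needs (a hypothetical sketch, not the actual implementation), which suggests why the typed-entry machinery of a dynstack could be more than is required here:

```c
#include <stddef.h>

/* Hypothetical sketch of a bulk closure stack: a bump pointer over a
   byte segment, with mark/release for unwinding.  No per-entry types
   or unwind hooks, which is where a full dynstack would add weight. */
typedef struct bulk_stack {
    unsigned char *base;
    size_t top, cap;
} bulk_stack;

static void *bulk_alloc(bulk_stack *s, size_t n) {
    if (s->top + n > s->cap)
        return NULL;                 /* caller grows or chains segments */
    void *p = s->base + s->top;
    s->top += n;
    return p;
}

static size_t bulk_mark(bulk_stack *s)              { return s->top; }
static void   bulk_release(bulk_stack *s, size_t m) { s->top = m; }
```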

Thanks for the info and Cheers!

On Tue, May 15, 2012 at 10:37 PM, Andy Wingo <address@hidden> wrote:
On Tue 08 May 2012 21:16, Stefan Israelsson Tampe <address@hidden> writes:

> I have three stacks
> 1. a control stack that stores undo information, scheme hooks (aka
> dynamic-wind), and stack references to the other stacks.

Have you seen dynstack.[ch] on master?

> 2. a stack from which data is allocated, like the bulk of a closure
> and special pairs
> 3. a cons stack, e.g. an array of allocated conses

I wonder, can you implement these too using something like the dynstack?

