
Re: [Chicken-users] String compression?

From: john
Subject: Re: [Chicken-users] String compression?
Date: Thu, 21 Dec 2006 15:50:34 +0000

I noticed this performance hit with z3 as well. I tried moving the
z3:encode-init call outside the loop, but that failed. I'm interested
in doing some performance comparisons of JSON, XML, etc. when
compressed, so I would also like to try out some alternative methods.


On 21/12/06, Graham Fawcett <address@hidden> wrote:
Hi folks,

Does anyone have code for compressing strings using zlib, LZO or some
other common library/algorithm? I seem to have z3 working, but the
performance is really terrible. To be fair, I may be doing something
wrong; the documentation is a bit light.

I'm getting ~20x better performance using (process "gzip") to fork
gzip and compress that way, compared with a string-compression
procedure that I copied from the z3 test-script:

address@hidden:/tmp$ csi -s compress-test.scm

Test expression is: (string-length (gzip-string *input*))
Evaluating once. Return value is: 29
Repeating 1000 times:
    0.71 seconds elapsed
    0.02 seconds in (major) GC
   43478 mutations
      10 minor GCs
      16 major GCs

Test expression is: (string-length (z3:compress-string *input*))
Evaluating once. Return value is: 19
Repeating 1000 times:
   17.48 seconds elapsed
    5.11 seconds in (major) GC
   16888 mutations
       3 minor GCs
    1667 major GCs

I know it's not an apples-to-apples comparison. But why the huge
difference? My lack of understanding of the z3 egg may be the cause.
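For reference, the fork-gzip approach is roughly the following, sketched
in Python with subprocess standing in for Chicken's (process "gzip"); the
gzip-string name and the per-call fork are my reading of the test, not
the actual script.

```python
import subprocess

def gzip_string(s: bytes) -> bytes:
    # Fork a gzip process per call: write the string to its stdin,
    # collect the compressed bytes from its stdout.
    p = subprocess.run(["gzip", "-c"], input=s,
                       stdout=subprocess.PIPE, check=True)
    return p.stdout

out = gzip_string(b"(a b c) " * 256)
assert out[:2] == b"\x1f\x8b"  # gzip magic number
```

That this beats an in-process zlib binding by ~20x strongly suggests the
binding is paying a heavy per-call setup or allocation cost, as the GC
numbers above also hint.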

Insights (or code!) are welcome. If you have code, I'm really just
looking for anything that will compress fairly regular strings
(s-expressions) of length 2K-20K, with speed favoured over compression
ratio. I'm considering wrapping LZO, but time is a bit short.
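Since speed is favoured over ratio here, the compression level is one
knob worth trying before wrapping a new library. Illustrated with
Python's zlib on a made-up regular s-expression (not a benchmark; level
semantics should carry over to any deflate binding):

```python
import zlib

sexp = b'(record (id 42) (name "widget") (tags (a b c))) ' * 100

fast  = zlib.compress(sexp, 1)   # level 1: favour speed
tight = zlib.compress(sexp, 9)   # level 9: favour ratio
assert zlib.decompress(fast) == sexp
assert zlib.decompress(tight) == sexp
assert len(fast) < len(sexp)     # even level 1 wins on regular text
```

On highly regular 2K-20K strings, level 1 typically compresses well
enough while using noticeably less CPU than the default level.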

Thanks very much,
