guile-devel

 From: Carl Witty
 Subject: Re: scm_i_fraction_reduce thread safety
 Date: 11 Dec 2003 11:19:02 -0800

On Thu, 2003-12-11 at 03:43, Bill Schottstaedt wrote:
> On the need for reduction, I wonder how much of that
> notoriety is due to Knuth -- his discussion of this
> was in a slightly artificial context, and I've been
> intending for a long time to check some "real-world"
> situation.  (I think it's less of a problem than
> calling gcd all the time).

I don't have any data on this (merely uninformed opinions), but I'd be
nervous about not reducing fractions.  If you always reduce, I think
that multiplies the cost of each operation by about a factor of log(n),
where n is the number of bits in the numerator or denominator, assuming
you use Euclid's gcd algorithm rather than something asymptotically
faster.  On the other hand, if somebody has an algorithm that builds up
fractions which are almost always reducible, then not reducing can make
the algorithm slower by an arbitrary amount.
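
As a rough sketch of what "always reduce" means in code (Python here,
with math.gcd standing in for Euclid's algorithm; this is not Guile's
actual implementation):

```python
from math import gcd  # Euclid's algorithm (with refinements) under the hood

def mul_reduced(a, b):
    """Multiply two fractions, each a (numerator, denominator) pair,
    reducing the result to lowest terms after every operation."""
    num = a[0] * b[0]
    den = a[1] * b[1]
    g = gcd(num, den)  # the extra ~log(n) cost paid on every multiply
    return num // g, den // g

print(mul_reduced((3, 4), (2, 9)))  # (1, 6)
```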

For instance, I recently needed to come up with a simple formula (to
find the volume of an n-dimensional simplex, if you're curious), where
part of the problem was to compute
(1/2)*(2/3)*(3/4)*(4/5)*...*(n/(n+1)).  When you're working out the
formula, it's easy to recognize that this product telescopes to
1/(n+1).  But if I had just wanted a few answers, rather than a general
formula, I might have written a little program and ended up with
fractions of the form n!/(n+1)!, which would be highly inefficient to
work with for large n.
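
To make the blow-up concrete, here is a small Python sketch
(hypothetical helper, not Guile code) comparing the product computed
with and without per-step reduction:

```python
from math import gcd

def telescoping_product(n, reduce_each_step):
    """Compute (1/2)*(2/3)*...*(n/(n+1)) as a (num, den) pair."""
    num, den = 1, 1
    for k in range(1, n + 1):
        num *= k
        den *= k + 1
        if reduce_each_step:
            g = gcd(num, den)
            num, den = num // g, den // g
    return num, den

# Unreduced, the components are n! and (n+1)!; reduced, they stay tiny.
n = 50
unreduced = telescoping_product(n, reduce_each_step=False)
reduced = telescoping_product(n, reduce_each_step=True)
print(len(str(unreduced[1])))  # 67 -- digits in 51!
print(reduced)                 # (1, 51)
```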

I think that one way to avoid arbitrarily-bad behavior is to reduce only
if the numerator or denominator is "sufficiently large" (I think this is
at most a constant factor worse than always reducing).  For instance, we
could reduce fractions only if the numerator or denominator is a bignum,
in hopes that we will get back to fixnum-sized numbers.
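
A minimal Python sketch of that heuristic, assuming a hypothetical
62-bit fixnum cutoff (Guile's actual fixnum width and its
scm_i_fraction_reduce internals differ):

```python
from math import gcd

FIXNUM_BITS = 62  # hypothetical fixnum width, for illustration only

def lazy_reduce(num, den):
    """Reduce only when a component has outgrown the fixnum range,
    in hopes of dropping back to fixnum-sized numbers."""
    if num.bit_length() > FIXNUM_BITS or den.bit_length() > FIXNUM_BITS:
        g = gcd(num, den)
        num, den = num // g, den // g
    return num, den

def mul(a, b):
    """Multiply two (num, den) fractions with lazy reduction."""
    return lazy_reduce(a[0] * b[0], a[1] * b[1])

# The telescoping product stays bounded: each time the components
# spill past FIXNUM_BITS, one gcd collapses them back to (1, k+1).
f = (1, 1)
for k in range(1, 1001):
    f = mul(f, (k, k + 1))
g = gcd(f[0], f[1])
print((f[0] // g, f[1] // g))  # (1, 1001)
```

The point of the sketch is that the gcd runs only occasionally, yet the
components never grow past roughly twice the fixnum width before being
collapsed, which is the "at most a constant factor worse" behavior.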

Carl Witty
