[Axiom-developer] Re: [sage-devel] Re: doctest failures due to rounding errors on Solaris.


From: Tim Daly
Subject: [Axiom-developer] Re: [sage-devel] Re: doctest failures due to rounding errors on Solaris.
Date: Fri, 01 Jan 2010 00:13:41 -0500
User-agent: Thunderbird 2.0.0.21 (Windows/20090302)

Dr. David Kirkby wrote:
rjf wrote:
On Dec 31, 11:15 am, "Dr. David Kirkby" <address@hidden> wrote:

RJF
The point you are missing is that we want to compare the output that Sage prints to a human.

The point you are missing is that the following item, which presumably could be printed by Sage, is perfectly readable to a human:

6121026514868073 * 2^(-51).

It exactly dictates the bits in an IEEE double-float, and does not require any conversion from binary to decimal. It does not need rounding. This kind of representation does not have any hidden unprinted digits. It does not ever need to be longer because of delicate edge conditions of certain numbers.

It happens to evaluate to APPROXIMATELY 2.718281828459045.
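As a rough illustration in plain Python (which Sage builds on; nothing below is Sage-specific), the exact integer-times-power-of-two form of the double nearest e can be recovered and checked like this:

import math

x = math.e                        # nearest IEEE double-float to e
num, den = x.as_integer_ratio()   # exact value of the double as a fraction
print(num, den)                   # 6121026514868073 2251799813685248 (den == 2**51)
assert den == 2**51
assert num * 2.0**-51 == x        # 6121026514868073 * 2^(-51) reproduces the double exactly
print(x)                          # approximately 2.718281828459045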

Sure, Sage could print that. It would also be worth printing the sign bit, so we could verify the values of

1) Sign bit
2) Significand
3) Exponent.

All of those could be correct. But there is still the software which does the non-trivial task of converting that into the base-10 representation used by humans. Then, in addition to that, there is the software which takes a base-10 number and shows it with the Sage prompt, adding carriage returns etc. where necessary. All of these can go wrong.
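As a sketch (plain Python, standard struct module, shown only to illustrate the three fields listed above), the sign bit, significand and exponent can be pulled out of a double directly:

import struct

x = 2.718281828459045
bits = struct.unpack('<Q', struct.pack('<d', x))[0]       # the 64 raw bits as an integer
sign        = bits >> 63                                   # 1) sign bit
significand = (bits & ((1 << 52) - 1)) | (1 << 52)         # 2) 52 stored bits plus the implicit leading 1
exponent    = ((bits >> 52) & 0x7FF) - 1023                # 3) exponent field, bias removed
print(sign, significand, exponent)                         # 0 6121026514868073 1
assert significand * 2.0**(exponent - 52) == x             # i.e. 6121026514868073 * 2^(-51)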

I would think in an almost ideal world, the test would be done at a higher level, using hardware/software which checked what the monitor actually displayed. That's not quite as easy to do though.

Even better would be some way to scan the brain of the user to see what he/she believes Sage is showing. Perhaps we use a font that is not very good, so despite the output being displayed properly, it is misunderstood.

Given that most of the time people want to see a base-10 representation of a number, and not a base-2, base-16 or IEEE 754 representation, I believe most testing should be done at the base-10 level.

If there is a reason for testing the IEEE 754 representation as the first choice, then you have yet to convince me of it.
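For what it's worth, this is roughly what a base-10 doctest check looks like in plain Python (the function here is made up purely for illustration): the expected decimal text is compared character for character, which is exactly where platform rounding differences can bite:

import math

def area(r):
    """Area of a circle of radius r.

    >>> area(1.0)
    3.141592653589793
    """
    return math.pi * r * r

if __name__ == '__main__':
    import doctest
    doctest.testmod()   # fails if the printed base-10 text differs by even one digit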


Dave


Dave,

Axiom has the same issues.

My take on this is that what you check depends on the reason you are checking. If you are generating the output for human use (e.g. a table), then you want decimal. If you are generating the output for regression testing (e.g. checking the answers on multiple hardware platforms), then you probably want Fateman's solution.
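A rough sketch of that split in plain Python (the helper names are just illustrative): float.hex() gives an exact, platform-independent rendering for regression comparison, while a formatted decimal is what you would put in a table for a human.

import math

def regression_form(x):
    """Exact, bit-faithful text for comparing results across machines."""
    return float(x).hex()             # e.g. '0x1.5bf0a8b145769p+1' for e

def display_form(x, digits=15):
    """Rounded base-10 text intended for a human reader."""
    return '%.*g' % (digits, float(x))

print(regression_form(math.e))        # 0x1.5bf0a8b145769p+1
print(display_form(math.e))           # 2.71828182845905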

Tim





