axiom-developer

## Re: [Axiom-developer] Zero divisors in Expression Integer

From: Waldek Hebisch
Subject: Re: [Axiom-developer] Zero divisors in Expression Integer
Date: Thu, 4 Jan 2007 15:46:56 +0100 (CET)

William wrote:

```
> > I have already written that due to incomplete
> > simplification we may get zero divisors in Expression
> > Integer.  Below is an easy example that multiplication
> > in Expression Integer is non-associative (or, if you
> > prefer, a proof that 1 equals 0):
> >
> > (135) -> c1 := sqrt(2)*sqrt(3*x)+sqrt(6*x)
> >
> >             +--+    +-+ +--+
> >     (135)  \|6x  + \|2 \|3x
> >                                        Type: Expression Integer
> > (136) -> c2 := sqrt(2)*sqrt(3*x)-sqrt(6*x)
> >
> >               +--+    +-+ +--+
> >     (136)  - \|6x  + \|2 \|3x
> >                                        Type: Expression Integer
> > (137) -> (1/c1)*c1*c2*(1/c2)
> >
> >     (137)  1
> >                                        Type: Expression Integer
> > (138) -> (1/c1)*(c1*c2)*(1/c2)
> >
> >     (138)  0
> >                                        Type: Expression Integer
>
> But this is not just an Axiom problem.  Mathematica does
> the same thing, with a slight variation on input:
>
> a1 = Sqrt[2]*Sqrt[3 Sqrt[5x + 7] + 6] - Sqrt[6 Sqrt[5x + 7] + 12]
> a2 = Sqrt[2]*Sqrt[3 Sqrt[5x + 7] + 6] + Sqrt[6 Sqrt[5x + 7] + 12]
> (1/a1)*a1*a2*(1/a2)                (* answer 1 *)
> (1/a1)*(a1*a2 // Simplify)*(1/a2)  (* answer 0; Simplify is needed *)
```

Well, if you do the simplification by hand, then my example gives
the same result in Maple, Maxima and giac.  But AFAICS the
Axiom situation is worse, because the simplification is
done automatically (apparently the other systems calculate in
a ring without zero divisors and only simplify the final answer,
which is sound).  Also, Axiom may automatically introduce
roots even if the original input contained none.  Anyway, I think
that in 2007 we should do better.
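One can check numerically that c1*c2 vanishes identically, so c1 and c2 really are zero divisors; a small Python sketch (illustration only, not Axiom code):

```python
import math

# c1*c2 = (sqrt(2)*sqrt(3x))^2 - sqrt(6x)^2 = 6x - 6x = 0 for every x > 0,
# so c1 and c2 are zero divisors even though neither is zero.
def c1(x): return math.sqrt(2)*math.sqrt(3*x) + math.sqrt(6*x)
def c2(x): return math.sqrt(2)*math.sqrt(3*x) - math.sqrt(6*x)

for x in (0.5, 1.0, 7.0, 123.456):
    assert abs(c1(x) * c2(x)) < 1e-9      # numerically zero everywhere
    assert abs(c1(x)) > 1.0               # but c1 itself is not zero
```

Dividing by either factor after forming the product is exactly the step that turns 1 into 0 in the session above.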

> The problem seems to be the lack of a canonical form for
> radical expressions and an algorithm to reduce expressions
> to canonical form.  A related problem is the lack of an
> algorithm to test for zero.  Another is denesting of nested
> radical expressions.  These problems have been studied by
> Zippel, Landau, Tulone et al., Carette and others.
>

I think that the core problem is that roots in computer algebra
are really ill-defined.  For well-defined non-nested roots,
Fateman gave a reasonable solution already in 1971.  General
well-defined roots can be resolved via factorization in algebraic
extensions (factorization is expensive, but seems to work
reasonably well in Axiom).

More precisely, one possibility is to say that sqrt(x+1) is just
a solution of the equation y^2 - x - 1 = 0.  Analytically,
sqrt(x+1) is then a multivalued analytic function with a
single finite branch point at x = -1 (and a second branch
point at infinity).  Algebraically, P(y) = y^2 - x - 1 is
an irreducible polynomial over Q(x) and we get a well-defined
quadratic extension.  A large part of Axiom assumes this
interpretation.
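The same idea in miniature, using the constant extension Q(sqrt(2)) instead of Q(x, sqrt(x+1)): modeling the quadratic extension as pairs reduced modulo y^2 - 2 gives a genuine field, with inverses computed via the norm.  A Python sketch (illustrative only, not Axiom code):

```python
from fractions import Fraction as F

# Q(sqrt(2)) modeled as Q[y]/(y^2 - 2): a pair (a, b) means a + b*sqrt(2).
# y^2 - 2 is irreducible over Q, so this quotient really is a field.
def mul(u, v):
    return (u[0]*v[0] + 2*u[1]*v[1], u[0]*v[1] + u[1]*v[0])

def inv(u):
    a, b = u
    n = a*a - 2*b*b          # the norm (a+b*sqrt2)(a-b*sqrt2); 0 only if u = 0
    return (a/n, -b/n)

u = (F(3), F(5))             # 3 + 5*sqrt(2)
assert mul(u, inv(u)) == (F(1), F(0))   # every nonzero element is invertible
```

Because the minimal polynomial is irreducible, the norm vanishes only at zero, so division never meets a zero divisor.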

If we look at sqrt(2)*sqrt(3*x)-sqrt(6*x), we see that
P(y) = y^2 - 6*x splits over Q(x, sqrt(2), sqrt(3*x)).  To
have a field we should either (more or less arbitrarily)
choose one of the roots (factors), or reject the input, or split
the computation into two branches (introduce provisos).
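When the defining polynomial is reducible, the quotient is no longer a field.  A Python sketch of k[y]/(y^2 - 6) with k = Q(sqrt(6)), where the two factors y - sqrt(6) and y + sqrt(6) multiply to zero (illustrative only):

```python
from fractions import Fraction as F

# Elements of k = Q(sqrt(6)) as pairs (p, q) meaning p + q*sqrt(6).
def kadd(a, b): return (a[0] + b[0], a[1] + b[1])
def kmul(a, b): return (a[0]*b[0] + 6*a[1]*b[1], a[0]*b[1] + a[1]*b[0])

# Elements of R = k[y]/(y^2 - 6) as pairs (A, B) meaning A + B*y.
# Over k the polynomial y^2 - 6 factors as (y - sqrt(6))*(y + sqrt(6)),
# so R is not a field and must contain zero divisors.
def rmul(u, v):
    (A1, B1), (A2, B2) = u, v
    # (A1 + B1*y)*(A2 + B2*y) = A1*A2 + 6*B1*B2 + (A1*B2 + A2*B1)*y
    six = (F(6), F(0))
    return (kadd(kmul(A1, A2), kmul(six, kmul(B1, B2))),
            kadd(kmul(A1, B2), kmul(A2, B1)))

one  = (F(1), F(0))
zero = (F(0), F(0))
u = ((F(0), F(-1)), one)   # y - sqrt(6)
v = ((F(0), F(1)), one)    # y + sqrt(6)
assert rmul(u, v) == (zero, zero)   # u*v = 0 although u, v are nonzero
```

This is exactly the structure Expression Integer lands in when it keeps sqrt(6*x) as a kernel alongside sqrt(2) and sqrt(3*x).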

There is an extra difficulty: traditional numerical definitions
of elementary functions have branch cuts.  Branch cuts mean
that we no longer have analytic functions, and we may get
zero divisors even where the algebraic approach would produce a
unique, well-defined result.
Branch cuts turn a decidable problem into an undecidable one:

Claim1: Let A be an algebra of complex functions closed under +, -, *
and composition.
Assume that A contains the rational constants, pi, x, sin, and sqrt (with
branch cut along the negative half-axis).  Then the following questions are
undecidable:

1) is a given element of A zero for all x?
2) is the real part of a given element of A always positive?

Proof: Let phi(t) = sqrt(sqrt(t^4)) and s(x) = sin(pi*x_1)*...*sin(pi*x_n).
Consider multivariate functions of the form
f = phi(P_1(x)) + phi(P_2*s(x)) + phi(s(x)) - 1.  As in Richardson's
proof, for any multivariate polynomial Q we can construct an f which
has positive real part if and only if Q has no integral zero
(Richardson used a more complicated Q, but thanks to Matiyasevich it
is enough to consider polynomial Q).  Then we get the conclusion the
same way as Richardson does.

Remark: We can use phi(t) - t to produce zero divisors in the algebra A
(that is why I omitted division from the signature).
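The trick behind the proof is that, on the reals, phi(t) = sqrt(sqrt(t^4)) equals |t|, which is not analytic at 0; phi - id vanishes on a half-line only, yet (phi(t) - t)*(phi(t) + t) = t^2 - t^2 is identically zero.  A quick numeric check in Python:

```python
import math

# For real t, phi(t) = sqrt(sqrt(t^4)) = |t|: branch cuts smuggle the
# absolute value into the algebra, producing zero divisors.
def phi(t): return math.sqrt(math.sqrt(t**4))

for t in (-3.0, -0.5, 0.0, 2.0):
    assert abs(phi(t) - abs(t)) < 1e-12
    # (phi - id)*(phi + id) = t^2 - t^2 = 0 identically...
    assert abs((phi(t) - t) * (phi(t) + t)) < 1e-12

assert phi(-3.0) - (-3.0) == 6.0   # ...but phi - id is not the zero function
```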

----------

Claim2: Let K be a differential field with field of constants k.
Assume that K = k(u_1, ..., u_n) is given as a tower of extensions
such that each u_i satisfies one of the following:

1) P(u_i) = 0, where P is an irreducible polynomial over
k(u_1,...,u_(i-1));
2) D u_i = (Df) u_i with f in k(u_1,...,u_(i-1));
3) D u_i = (Df)/f with f in k(u_1,...,u_(i-1)).

In cases 2 and 3 we assume that u_i is transcendental over
k(u_1,...,u_(i-1)).  Assume also that k is effectively computable.
Then K is effectively computable, and given an equation for u of the form
D u = (Df) u or D u = (Df)/f with f in K, there is an algorithm which either
finds an algebraic extension of K containing a u solving the
equation, or proves that adjoining a transcendental u satisfying the
equation does not add new constants.  We can also decide whether a given
polynomial is irreducible over K.

Proof: The claim is just a restatement of the Risch structure theorem
plus known (and implemented in Axiom) algorithms for computation
in algebraic extensions.
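The two transcendental cases of Claim2 are just the defining equations of exp(f) and log(f).  A quick numeric sanity check in Python, with an arbitrarily chosen f (the choice f(x) = x^2 + 1 is mine, purely for illustration):

```python
import math

# Case 2: u = exp(f) satisfies  Du = (Df) * u
# Case 3: u = log(f) satisfies  Du = (Df) / f
def f(x):  return x*x + 1
def df(x): return 2*x

def deriv(g, x, h=1e-6):               # central finite difference
    return (g(x + h) - g(x - h)) / (2*h)

x = 0.7
assert abs(deriv(lambda t: math.exp(f(t)), x) - df(x)*math.exp(f(x))) < 1e-3
assert abs(deriv(lambda t: math.log(f(t)), x) - df(x)/f(x)) < 1e-6
```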

-----

Claim2 means that computation with elementary functions in the algebraic
setup may be done effectively, as long as we do not get into trouble
due to constants (and if the Schanuel conjecture is true, computation
with constants is effective as well).  In practice, regardless of the
Schanuel conjecture, we may use floating-point computations to prove that
a constant is non-zero, or to get an indication that with high
probability it is indeed zero.
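Such a floating-point zero test might look as follows (a heuristic sketch; the constant below is exactly zero because (sqrt(2) + sqrt(3))^2 = 5 + 2*sqrt(6)):

```python
import math

# A cheap numeric zero test for constants: evaluate in double precision
# and compare against a tolerance.  Evidence, not proof.
def probably_zero(c, tol=1e-12):
    return abs(c) < tol

c = math.sqrt(2) + math.sqrt(3) - math.sqrt(5 + 2*math.sqrt(6))   # exactly 0
d = math.sqrt(2) + math.sqrt(3) - math.sqrt(5)                    # nonzero
assert probably_zero(c)
assert not probably_zero(d)
```

A nonzero verdict here is a genuine proof once the evaluation error is bounded; only the "zero" verdict remains probabilistic.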

Coming back to Axiom expressions: the theoretical basis for the Expression
domain is a theorem that any finitely generated field K has the form

K = k(u_1, ..., u_m, u_{m+1}, ..., u_n)

where k is a prime field, u_1, ..., u_m are algebraically independent
(so k(u_1, ..., u_m) is just the field of rational functions) and
u_{m+1}, ..., u_n correspond to a tower of algebraic extensions.  In
particular, we should know the minimal polynomial of u_l over
k(u_1, ..., u_{l-1}) for l = m+1, ..., n.

So, one way to solve the zero-divisor problem in Expression is to rewrite
all kernels in terms of a set of algebraically independent kernels
and a set of "independent" algebraics.  We can use the Risch structure
theorem and factorization in algebraic extensions to find such
independent sets, as long as we limit ourselves to elementary
functions.  Unfortunately, the situation with special functions
is much less clear: AFAIK there is no known comprehensive structure
theorem, but also no known obstacle to deriving such a theorem.

If we limit ourselves to functions defined by linear ODEs, we can
effectively solve the zero-equivalence problem.

So, as an alternative solution, we can perform divisions only if we can
prove that the divisor is non-zero.  In most cases such a proof could be
done by a rather cheap floating-point computation (but ATM Axiom has
very weak support for numeric computation of special functions).
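A guarded division along these lines might be sketched as follows (the names and the sampling-based test are hypothetical, standing in for a real zero test):

```python
import math, random

# Refuse to divide when the divisor looks identically zero on a sample
# of random evaluation points.  Heuristic stand-in for a real zero test.
def looks_zero(g, samples=20, tol=1e-9):
    random.seed(0)
    return all(abs(g(random.uniform(0.1, 10.0))) < tol
               for _ in range(samples))

def guarded_div(f, g):
    if looks_zero(g):
        raise ZeroDivisionError("divisor is (numerically) identically zero")
    return lambda x: f(x) / g(x)

# c1*c2 from the example at the top: identically zero, so division is refused.
c1c2 = lambda x: ((math.sqrt(2)*math.sqrt(3*x) + math.sqrt(6*x))
                  * (math.sqrt(2)*math.sqrt(3*x) - math.sqrt(6*x)))
try:
    guarded_div(lambda x: 1.0, c1c2)
    ok = False
except ZeroDivisionError:
    ok = True
assert ok
```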

I am not sure how well we want to support branch cuts.  I believe
that in most practical cases functions are analytic; for example,
sqrt(exp(%i*x)) is just exp(%i*x/2) (while the branch-cut interpretation
would produce artificial discontinuities).  In other words, "correctly"
handling branch cuts means solving a hard problem and giving a result
which probably does not match user expectations...
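Python's cmath follows the numeric branch-cut convention, so one can watch the two interpretations agree on the principal branch and then split apart:

```python
import cmath

# Principal-branch sqrt(exp(i*x)) equals exp(i*x/2) only while
# arg(exp(i*x)) stays in (-pi, pi]; past the cut they differ by a sign.
def lhs(x): return cmath.sqrt(cmath.exp(1j*x))
def rhs(x): return cmath.exp(1j*x/2)

assert abs(lhs(3.0) - rhs(3.0)) < 1e-12   # 3 < pi: both sides agree
assert abs(lhs(4.0) - rhs(4.0)) > 1.9     # 4 > pi: they disagree...
assert abs(lhs(4.0) + rhs(4.0)) < 1e-12   # ...by exactly a sign flip
```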

OTOH, for numeric computation branch cuts seem to be the accepted
solution.  Using one definition for symbolic computations
and a different one for numeric computations breaks one
of the fundamental expectations (namely, that evaluating functions
at a point is a homomorphism).

--
Waldek Hebisch