My current "line of attack" on this research is to try to prove the
GCD algorithm in NonNegativeInteger.
While this seems trivial in proof systems, it is expensive to
compute from the inductive definition. While it seems
trivial in computer algebra, the implementation code lacks
any formal specification or proof.
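To make the gap concrete, here is a hedged Lean 4 sketch (the name
myGcd is mine, and the termination tactic details vary across Lean
versions) of Euclid's algorithm written directly over the inductive
naturals:

```lean
-- Sketch (Lean 4): Euclid's algorithm over the inductive naturals.
-- The equations are easy; the obligation that the second argument
-- decreases (m % (n + 1) < n + 1) is what the compiler must
-- discharge, and obligations like this are what make a "trivial"
-- algorithm nontrivial over the inductive definition.
def myGcd : Nat → Nat → Nat
  | m, 0     => m
  | m, n + 1 => myGcd (n + 1) (m % (n + 1))
termination_by m n => n
decreasing_by
  -- goal: m % (n + 1) < n + 1, which holds since n + 1 > 0;
  -- the exact tactic invocation may differ between Lean versions
  exact Nat.mod_lt _ (Nat.succ_pos _)
```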
There are several steps I'm taking. I'm creating a new compiler
that handles Spad code with several new additions. The new
language (and the effort in general) is called "Sane".
One step is to make sure the new compiler generates code
that runs in Axiom. This is challenging as there is almost no
documentation about Axiom internals, so it all has to be
reverse-engineered.
Next is the addition of "axioms" to the categories, such as
adding axioms for equality to BasicType, where equality is
defined to include reflexive and transitive properties.
(Equality, by the way, is probably THE most controversial
topic in logic; cf. the univalence axiom in HoTT.) These
axioms decorate the category signatures and are inherited.
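A hedged Lean 4 sketch of the idea (all names are illustrative, not
Axiom's actual hierarchy): a BasicType-like category whose signature
carries equality axioms that implementations must discharge.

```lean
-- A category signature decorated with axioms: the operation `eq`
-- comes with proof obligations, inherited by anything below it.
class BasicTypeLike (T : Type) where
  eq    : T → T → Prop
  refl  : ∀ x : T, eq x x
  trans : ∀ x y z : T, eq x y → eq y z → eq x z

-- A domain "inheriting" the category must supply proofs, not just
-- code; here Nat discharges the axioms with propositional equality.
instance : BasicTypeLike Nat where
  eq    := Eq
  refl  := fun _ => rfl
  trans := fun _ _ _ h₁ h₂ => h₁.trans h₂
```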
Next is the addition of axioms to domains, also decorating
the signatures with axioms.
Next is the logical specification of properties of the data
type implementation of the domain, called the REP in
Axiom and the "carrier" in logic. For example, think of a
binary tree REP and what properties you can guarantee.
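For instance, a binary-tree REP and one property the carrier can
guarantee might be rendered in Lean 4 roughly as follows (a sketch;
all names are mine):

```lean
-- The representation (REP in Axiom, "carrier" in logic).
inductive Tree (α : Type) where
  | leaf : Tree α
  | node : Tree α → α → Tree α → Tree α

-- Membership in the representation.
def Mem (x : Nat) : Tree Nat → Prop
  | .leaf       => False
  | .node l k r => x = k ∨ Mem x l ∨ Mem x r

-- A property the REP can guarantee: the search-tree ordering
-- invariant, stated over the representation alone, independent of
-- any operation that manipulates it.
def Ordered : Tree Nat → Prop
  | .leaf       => True
  | .node l k r =>
      Ordered l ∧ Ordered r ∧
      (∀ x, Mem x l → x < k) ∧ (∀ x, Mem x r → k < x)
```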
Next is adding a specification for the operations that
implement the signatures. These are specific to each
function that a domain implements.
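Returning to the GCD example, an operation specification might look
like this in Lean 4 (a sketch; IsGcd is my name, and the
Nat.gcd_dvd_left / Nat.dvd_gcd lemmas are as found in Lean's library,
which may mean Std/Mathlib depending on version):

```lean
-- What it means for g to be a gcd of m and n, stated independently
-- of any implementation: g divides both, and every common divisor
-- divides g.
def IsGcd (m n g : Nat) : Prop :=
  g ∣ m ∧ g ∣ n ∧ ∀ d, d ∣ m → d ∣ n → d ∣ g

-- The proof obligation attached to an implementation: Lean's own
-- Nat.gcd satisfies the specification.
theorem gcd_meets_spec (m n : Nat) : IsGcd m n (Nat.gcd m n) :=
  ⟨Nat.gcd_dvd_left m n, Nat.gcd_dvd_right m n,
   fun _ hm hn => Nat.dvd_gcd hm hn⟩
```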
Next is decorating code with pre- and post-conditions
as well as loop invariants and storage invariants.
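In a dependently typed setting, a pre-/post-condition pair can itself
be a type. A hedged Lean 4 sketch (names mine; Nat.div_mul_le_self is
the core library lemma):

```lean
-- Precondition: the caller supplies evidence 0 < n (here `_hn`
-- documents intent, since Lean's Nat division totalizes n = 0).
-- Postcondition: the result carries its own correctness evidence,
-- packaged as a Subtype.
def safeDiv (m n : Nat) (_hn : 0 < n) : { q : Nat // q * n ≤ m } :=
  ⟨m / n, Nat.div_mul_le_self m n⟩
```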
Next the collection of all of the above is cast into a form
used by a proof system (currently Lean) that implements
dependent types (the Calculus of Inductive Constructions).
Next a proof is constructed. The resulting proof is attached
to the Axiom code. Proof checking is wildly cheaper than
proof construction and the new Sane compiler would
perform proof checking at compile time for each function.
So if there is a breaking change somewhere in the tower
the proof would fail.
Challenges along the way, besides reverse-engineering
the Spad compiler, include adding languages for stating
axioms, for stating REP properties, for stating function
specifications, for stating program properties, and for
stating proof certificates. The pieces all exist somewhere
but they are not necessarily compatible, nor well defined.
Is this all possible to do? Well, of course, as this is all
"just mathematics". Do *I* know how to do this? Well,
of course not, which is what makes this a research effort.
Ultimately I'm trying to build an instance of merging proof
and computer algebra at a very deep, proven level. Think
of it as a PhD thesis project without the degree incentive :-)
On 6/25/19, Martin Baker <address@hidden> wrote:
On 6/25/19, William Sit <address@hidden> wrote:
> The expression x = x + 0, whether treated as a type or an equation,
> can only make sense when x, =, + and 0 are clearly typed and defined.
> It makes little sense to me that this, as an equation, can be "proved"
> to be valid universally (that is, without the definition of, say +).
If x is a natural number defined like this in Coq:
Inductive nat : Set := O : nat | S : nat -> nat.
then x = x + 0 is not an axiom but is derivable.
Of course this all depends on the structures and definitions, I didn't
mean to imply that it is valid universally.
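A Lean 4 analogue of Martin's point (hedged; note that Lean's Nat.add
recurses on its second argument, the mirror image of Coq's plus, so
the roles of the two equations swap):

```lean
-- x + 0 = x holds by computation, because addition recurses on its
-- second argument; no axiom is involved.
example (x : Nat) : x + 0 = x := rfl

-- The mirror fact 0 + x = x is not definitional and needs induction,
-- just as x + 0 = x does in Coq.
theorem zero_add' (x : Nat) : 0 + x = x := by
  induction x with
  | zero      => rfl
  | succ n ih => rw [Nat.add_succ, ih]
```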
On 25/06/2019 19:28, Tim Daly wrote:
The "elegant way" that Martin is questioning is the problem
of combining a certain kind of logic operation (refl aka
"reflection" where both sides are equal) with the notion of
a mathematical unit.
I think that refl (short for "reflexivity" of the = relation) is the usual
abbreviation for the only constructor of an equality type in Martin-Löf
type theory.
I get the impression that this theory is very powerful in proof
assistants, and I am asking whether you are proposing to build this
into Axiom and, if so, how?