
RE: [Axiom-developer] RE: Bootstrapping

From: C Y
Subject: RE: [Axiom-developer] RE: Bootstrapping
Date: Thu, 10 Nov 2005 07:42:52 -0800 (PST)

--- Bill Page <address@hidden> wrote:

> On November 10, 2005 3:12 AM Andrey G. Grozin
> > 
> > On Wed, 9 Nov 2005, C Y wrote:
> > > Years ago Ken Thompson proposed a diabolical attack on a
> > > computer that could be made by putting a trap door in a
> > > compiler, which would automatically build it into all software
> > > and subsequent versions of itself, undetectably.  (I think this
> > > is the article: )  That kind
> > > of thing makes people (especially open source folk, I think)
> > > suspect all binaries, and for good reason.
> It must be the approaching Winter season or maybe it is this pain
> in my back that won't go away anymore, but I seem to be disagreeing
> with almost everyone here lately... :(

Didn't somebody respond to that situation once with "How can I be
agreeable when everybody is wrong?" ;-)
> I believe that such an attack is technically possible, but I 
> disagree strongly that therefore there is a good reason to suspect
> all binaries. Modern network computing (like life in general) is
> a social phenomenon. In social interactions it is extremely
> important that one establish relationships based on trust.

The problem is that you have to trust not only the person you are
getting the binary from, but also that their computer is secure and the
files haven't been replaced, that the original computer it was compiled on was
uncompromised, that the binaries used on that computer to make the
binary you are downloading don't contain any earlier backdoors they can
pass on...

In point of fact, I don't think any true (i.e. by-hand, human-performed)
bootstrapping has occurred within recent history, so in a sense all
software today relies on some unknown binaries from the past.  This
sounds like a good project for a government, actually - establish open
software and hardware standards to create a system that, with the
minimum necessary work from a human being, can start the binary chain
reaction going.  Of course, even if I somehow propose that to my
congressman I'll probably just get a weird look...

> It is only by trusting others that it is possible to build a co-
> operative collaborative environment that is more than the sum of
> its parts.

True, de facto.  But it would be nice if, at least in theory, it were
possible for some group or organization to bootstrap from the beginning
using open techniques, software, and hardware.  Then we would at least
know who we are trusting, and if needed that bootstrap could be
reproduced elsewhere and compared.

Actually, that sounds like something the FSF might be good for... 

> Given the aggressive and competitive nature of people, companies
> and governments, no doubt this might seem a little naïve to some
> people, but trust me, it is not ... :)

I'm reminded of an incident a while back where Microsoft was quietly
sticking unique identifying numbers in every Word document...  they got
caught and issued a patch to stop it:

If they did it once...

The problem with any binary is it involves a web of trust reaching all
the way back to the beginnings of the first binary that created it.  We
don't even know WHO we are trusting, so there is no way to evaluate
whether we should be trusting them.  I think some academic work has
been done on "web of trust" issues, but I'm not really familiar with it.

If we were to establish a clear, documented trail of binary history
back to a few original bootstrapping events, with GPG signatures,
identical files produced between different entities every step along
the way, and full documentation of all tools and hardware used, then I
agree that's probably workable.
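The "identical files produced between different entities" check is what the reproducible-builds approach automates: each independent builder publishes a checksum of the binary it produced from the same source, and anyone can compare them. A toy sketch in shell (file names and contents are made up for illustration; a real check would hash real build outputs):

```shell
#!/bin/sh
set -e
# Simulate two independent parties building the "same" binary from one
# source tree.  (These files are stand-ins for real compiled output.)
mkdir -p builder-a builder-b
printf 'pretend compiled output\n' > builder-a/axiom.bin
printf 'pretend compiled output\n' > builder-b/axiom.bin

# Each builder publishes the SHA-256 of its build...
sha256sum builder-a/axiom.bin | awk '{print $1}' > sum-a.txt
sha256sum builder-b/axiom.bin | awk '{print $1}' > sum-b.txt

# ...and users compare the published checksums: a mismatch means at
# least one build (or builder) cannot be trusted.
if cmp -s sum-a.txt sum-b.txt; then
    echo "builds match"
else
    echo "builds differ" >&2
    exit 1
fi
```

This only proves the builders agree with each other, not that the toolchain is clean, which is why the documented trail back to the bootstrap still matters.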

> The implementation of trusted computing on the Internet is
> already quite well advanced. Many binary programs are available
> with electronic signatures that guarantee authenticity and
> origin.

But how far back?  Binaries were used to create the signed binaries,
after all.

> Yes, any system (at least those in common use now) can
> be broken, but we trust these people, e.g. the GNU Free Software
> Foundation, or for that matter even the Axiom developers, not to
> behave in a malicious manner.

Right, but in software you're trusting every binary ever in the chain
of software production.  AFAIK that is NOT well documented, anywhere.

> No matter what we do technically,
> in the end security always comes down to trusted relationships,
> from computer to computer, computer to human, and human to human.

Right.  However, knowing nothing about anybody beyond the immediate
developers and distributors we are dealing with, we can't judge for
ourselves where our web of trust should end.  I agree the Axiom
developers are a good crowd, but that's only the tip of the iceberg.

> > Yes. I dislike having any binaries in my system I have not
> > compiled myself. Therefore, I use Gentoo (installed from stage
> > 1, so I recompiled gcc too). Of course, this does not help
> > against the Thompson's attack.
> By arguing in favour of bootstrapping, I am certainly *not* arguing
> against the idea of compiling as much open source software from
> source as possible - from the kernel up. I think that such an
> approach does effectively deter Thompson's attack (but not prevent)
> because at least in principle the possibility of comparing the
> source to the generated binary does exist.

I agree it does deter it, but someone needs to certify that the binary
being used to bootstrap gcc doesn't have any backdoors in it.  I'll
agree it probably doesn't, as a point of fact, but we can't KNOW that
unless the original source -> unoptimized binary event by human beings
is documented and recorded, along with the full compiler binary history
from that point to the current bootstrap binary.  That's the full web
of trust, and full documentation of that is what is needed for
informed decisions.
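For what it's worth, GCC's own build already does a weak form of this check: the untrusted system compiler builds stage 1, stage 1 builds stage 2, stage 2 builds stage 3, and the bootstrap aborts unless the stage 2 and stage 3 object files are bit-for-bit identical. That doesn't defeat a Thompson-style trojan (a trojaned stage 1 could propagate itself consistently into both stages), but it is the comparison step in miniature. A hypothetical sketch of just that step (the stage directories and object files here are fakes standing in for a real bootstrap tree):

```shell
#!/bin/sh
set -e
# Hypothetical stage directories; in a real GCC bootstrap, stage2/ and
# stage3/ would hold the compiler's own object files.
mkdir -p stage2 stage3
printf 'object code\n' > stage2/cc1.o
printf 'object code\n' > stage3/cc1.o

# The "compare" step: fail if any object file differs between the
# compiler built by stage1 and the compiler built by itself.
for obj in stage2/*.o; do
    cmp "$obj" "stage3/${obj#stage2/}" || {
        echo "bootstrap comparison failed: $obj" >&2
        exit 1
    }
done
echo "stage2 and stage3 are identical"
```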

> > > Not in light of things like Ken Thompson's proposed attack. 
> > > Security people may be paranoid, but on the internet paranoia
> > > is a virtue.
> No. Paranoia is a disease, like depression. It is a social/medical
> condition that needs to be treated.

OK, technically true.

> Security is another thing all together. It consists of using the
> right technology, having a clear understanding of the way the
> system works, and establishing trusted relationships. Security
> is not a matter of hiding knowledge and hording control.

The problem is, how can you make an intelligent decision to trust when
you have no knowledge of the entity you are trusting?

> > As one of my colleagues said,
> > 
> > For a sysadmin, the absence of paranoia is called professional 
> > incompetence.
> I think your colleague does not have a clear understanding of
> security.

I think the above statement is essentially shorthand acknowledgement of
the following:

a)  when a computer is connected to the internet, anyone in the world
can launch an attack against it.
b)  given the number of people on the net, there will be bad actors -
the statistical chances of this being the case approach unity quite
quickly.
c)  there is a significant, non-zero chance that my machine will come
under attack.
d)  any part of my system not personally verified by myself is an
unknown, and I cannot state with certainty that it contains no
malicious code.
e)  any binary not created on site from source code cannot be
inspected, and therefore is only as trustworthy as the ENTIRE web of
trust behind its creation.
f)  since functionality must be provided, the best available measures
are to reduce the required bootstrapping binaries to a minimum,
reducing the number of potential problems.  Hence the appeal, in the
open source world, of relying only on gcc for bootstrapping events - if
a problem is ever found, that makes only ONE piece of software that has
to be redone the "hard way".  Then everything else can be rebuilt from
source.

So: if Axiom (at least SOME version of Axiom) can be built with just a
working GCL, and GCL can be built by gcc, then Axiom only relies on gcc
for bootstrapping, which is a common reliance in the open source world.

> > Sorry for off-topic.
> Andrey, I think that although this might be a side-issue, it
> is not really off-topic since as open source developers we do
> distribute both binaries and source code for Axiom. And I
> think we should take some steps that we are not taking now to
> help ensure that what we distribute is trusted by Axiom users.

I agree here - signed binaries might be a very good idea.  I'll admit
I don't know too much about that myself (I admit I'm too trusting) but
it's definitely the "right" way to do things.  I'll have to look into
what the modern ideas on that issue are.
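For the mechanics, GnuPG detached signatures are the usual approach: a release manager signs the tarball, and users verify the signature against the project's published key before trusting the download. A self-contained sketch using a throwaway, passphrase-less key generated on the spot (key identity and file names are invented for the demo; a real release key would be protected and published):

```shell
#!/bin/sh
set -e
# Use an isolated keyring so this demo doesn't touch the real one.
export GNUPGHOME="$(mktemp -d)"
chmod 700 "$GNUPGHOME"

# Generate a throwaway signing key non-interactively (demo only).
gpg --batch --quiet --gen-key <<'EOF'
%no-protection
Key-Type: RSA
Key-Length: 2048
Name-Real: Demo Release Signer
Name-Email: demo@example.org
Expire-Date: 0
%commit
EOF

# Sign a (fake) release tarball with a detached signature...
printf 'fake release contents\n' > axiom-release.tar
gpg --batch --quiet --detach-sign \
    --output axiom-release.tar.sig axiom-release.tar

# ...which users verify before trusting the download.
gpg --batch --quiet --verify axiom-release.tar.sig axiom-release.tar \
    && echo "signature OK"
```

Of course, this only authenticates who signed the binary, not the toolchain that produced it, so it complements rather than replaces the bootstrap trail above.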


