Re: bit-split, or: the schizophrenia of trusted computing

From: Lluis
Subject: Re: bit-split, or: the schizophrenia of trusted computing
Date: Mon, 1 May 2006 16:45:54 +0200
User-agent: Mutt-ng devel-r796 (based on Mutt 1.5.11/2005-09-15)

On Mon, May 01, 2006 at 05:36:10AM +0200, Marcus Brinkmann delighted us with the following words:
> You seem to ask me if I have a general proof that all uses of a TC chip 
> are harmful.  I don't have such a proof.  I have, for me personally, 
> decided that all use cases that I know about (and I have done an 
> extensive search) are harmful, if they have any broader impact at all 
> (there are some use cases which don't appear to be very harmful, but they 
> also have no broader impact, so who cares?).
> I have also given some thoughts on why this may be the case.  But this 
> is not a proof.  You will have to make up your own mind.  If you find 
> a use case you are interested in, maybe you want to submit it to the 
> challenge I posted.

First of all, I don't know whether what I'll describe counts as a non-trivial 
confinement mechanism, but I think it does, and I would advocate for it, 
even though I'm absolutely in favour of free software.

Background: I'm an avid reader of many kinds of literature, and one genre 
I like is hard science fiction (have you ever read Greg Egan? ;)).

All this nonsense being said, I sometimes dream (at the times when I'm 
mentally worse ;)) of ubiquitous computing, with an informational and 
computational cloud following each user.

In order to support launching computations on the "environment" (the 
surrounding computers), I think two (in fact one) "things" are needed:
1) The host machines doing the computations, which I do not own, should be 
   able to run my computations without being able to inspect or leak my 
   information (as it could potentially be personal and used "against" me)

2) To be sure that they conform to that "no-leak" property, some sort of 
   DRM, TCPA or similar system should be used. From what I understood of a 
   past thread about this (although I didn't follow it completely), this 
   kind of hardware lets a chain of trust be built over the software, so 
   we can be sure that no leak of information will happen

So, a system like this should have:
  a) a way to build a chain of trust over the software it runs (through 
     DRM-like hardware), based on software that
  b) provides the property that no information will ever be leaked to 
     entities other than me

The latter can be accomplished either by providing means of "non-trivial 
confinement" (I mean, that nobody except me will be able to inspect that 
program or its results), or by an assurance that no information will ever 
be leaked outside the machine; but the second option would lead to complete 
isolation of the machine, rendering it useless for this use case.

So I think that an unforgeable digital signature of the running system 
(provided by DRM-like hardware), plus the possibility to run "opaque" 
processes, is the only way to support this.
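That "unforgeable signature of the running system" is what remote attestation provides: before I hand my data to a host, it must prove (freshly, against a nonce I choose) that it booted the expected stack. A toy challenge-response sketch follows; for simplicity it uses an HMAC with an assumed shared key, whereas a real TC chip would sign with the private half of an asymmetric key pair burned into the hardware:

```python
import hashlib
import hmac
import secrets

# Hypothetical attestation key; stands in for the chip's signing key.
ATTESTATION_KEY = b"chip-endorsement-key"

def quote(pcr: bytes, nonce: bytes) -> bytes:
    """Host side: produce a 'quote' binding the measured state to the
    verifier's fresh nonce, so stale quotes cannot be replayed."""
    return hmac.new(ATTESTATION_KEY, pcr + nonce, hashlib.sha256).digest()

def verify(expected_pcr: bytes, nonce: bytes, reported: bytes) -> bool:
    """User side: accept the host only if it proves it is running the
    expected (non-leaking) software stack."""
    return hmac.compare_digest(quote(expected_pcr, nonce), reported)

# One challenge-response round trip.
expected = hashlib.sha256(b"trusted-stack").digest()  # digest I trust
nonce = secrets.token_bytes(16)                       # my fresh challenge
assert verify(expected, nonce, quote(expected, nonce))

# A host running a different stack cannot produce an acceptable quote.
other = hashlib.sha256(b"other-stack").digest()
assert not verify(expected, nonce, quote(other, nonce))
```

Only after `verify` succeeds would I ship my "opaque" computation and data to that host.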

Whether this use case is useful or not is another matter ;)

Hope I explained it clearly.

 "And it's much the same thing with knowledge, for whenever you learn
 something new, the whole world becomes that much richer."
 -- The Princess of Pure Reason, as told by Norton Juster in The Phantom
 Listening: Pierre Fournier (JS Bach - Suites For Unaccompanied Cello) - 32. 
Suite 6 (D Major) - Allemande
