Re: bit-split, or: the schizophrenia of trusted computing

From: Michal Suchanek
Subject: Re: bit-split, or: the schizophrenia of trusted computing
Date: Mon, 1 May 2006 22:57:06 +0200

On 5/1/06, Marcus Brinkmann <address@hidden> wrote:
> At Mon, 01 May 2006 14:55:40 -0400,
> "Jonathan S. Shapiro" <address@hidden> wrote:
> > On Mon, 2006-05-01 at 20:30 +0200, Marcus Brinkmann wrote:
> > >
> > > > I will go further: in the absence of OS support, such violations cannot
> > > > (in general) even be *detected*, so the suggestion that they can be
> > > > deferred to social or legal enforcement actually means that you are
> > > > declaring that these types of encapsulation can be violated without any
> > > > human consequence at all -- or at least that the possibility of such a
> > > > violation with serious human consequence places the problem domain, by
> > > > definition, outside of the applications that are "of interest to the
> > > > Hurd".
> > >
> > > I can't parse that paragraph.
> > Sorry. Let me try to explain.
> > If we say "mechanical prevention has other bad consequences, so we will
> > leave problem X for social enforcement" we have a problem. In order for
> > social enforcement to actually occur, we must be able to detect that the
> > undesirable action X actually occurred. Any means of detecting this is
> > necessarily built on top of enforcing primitives that are used "softly".
> > So if we say that we wish to remove those primitives, we are saying that
> > it is not important to be able to detect the undesired actions X. In
> > consequence, we are declaring that they are not important enough to
> > deserve "enforcement" in the social sense either.

> I don't believe that this follows.  It is, at the very least, a
> simplification.  First of all, if the undesired action X has
> consequences for anybody but the violator, then it may be detectable
> by these consequences.  This is not necessarily the case, but it is
> also an important consideration.

> In civil cases, it may even be necessary for the plaintiff to
> demonstrate that these consequences occurred to be eligible for any
> damages.

> In fact, society allows, quite intentionally, for some violations to
> go unnoticed.  In Germany, there is a saying, "Wo kein Klaeger, da
> keine Klage." (Where there is no plaintiff, there is no lawsuit.)
> This includes both violations that go undetected, as well as
> violations that are detected but ignored by the potential plaintiff.

> I am not claiming that this is necessarily the ideal world to be in.
> However, one should be extremely cautious in suggesting to disturb a
> balance that is the result of thousands of years of popular struggle.
> Any such change must be well motivated, and its effects have to be
> demonstrated before implementing such a change.

> An expert in law could surely fill us in with some interesting history
> here.  In any case, however, I think that one can safely state that
> society rejects a system where every single violation is automatically

Oh, I guess that this is exactly where DRM fails. Plus, it is planned
to impose even more restrictions than necessary (because it is
technically impossible to do what is written in the law).

> detected and prevented.  I think the scare terms here are totalitarian
> control and police state.

No, totalitarian control and a police state are probably more about
'recording every single violation so that evidence does not have to be
fabricated when you want to get rid of a person legally'. But
totalitarian control is also about being able to do anything to anyone
without regard to any law.


