Re: Part 2: System Structure


From: Jonathan S. Shapiro
Subject: Re: Part 2: System Structure
Date: Thu, 18 May 2006 14:32:17 -0400

On Thu, 2006-05-18 at 15:36 +0200, Marcus Brinkmann wrote:
> At Thu, 18 May 2006 13:24:01 +0200,
> Bas Wijnen <address@hidden> wrote:
> > The hospital case is very special.
> 
> The hospital case can be summarized like this: "The law requires the
> implementor to implement DRM.  How do you do it?"  Well, if that is in
> fact true, then the right question to ask is not "How do we do it?"
> but: "Is the law the right law?"

The answer to this seems fairly clear: this law (and its international
equivalents) is a direct response to serious violations of patient
privacy that occurred because management of data was not adequately
controlled against insider (user) misbehavior.

You may argue (and I would agree) that the current law is too complex.
This is not the fundamental issue. The fundamental issue is that there
are many incentives for hospital personnel to "cheat", and the cost to
the patient of recovering from such an illegal disclosure is very very
high (in some cases, recovery is *impossible*). It has been demonstrated
by existing practice that there is no practical means to police this
behavior using social mechanisms. It has also been demonstrated that the
courts and the social process of recovery are so expensive and so
ineffective (even when the patient wins) that preemptive protection of
this information is justified.

> The hospital example has another problem: It is far too complex.  I do
> not have the resources required to study it.  I can not find out what
> the law says, nor how it came into existence, what its likely effects
> are, and what the opportunity costs are in its design.  The best I
> could try is to find experts on the matter and consult them, but even
> that is a time consuming process.

Real requirements are often found only in real problems. Conversely,
simplistic scenarios can usually be satisfied with simplistic solutions.
The problem is that these simplistic solutions do not meet reality on
favorable terms.

Marcus: I don't agree with your view of the ethics of information
control, but I do see that it is a legitimate point of view to explore.
From a system architecture perspective, my objection is that I consider
it excessively simplistic, and my experience leads me to believe that a
system built in this way will not be acceptable in real-world conditions.

And my problem with this is that building an OS is a *huge* undertaking,
and because of this I feel that it is immoral for *me* to entice a large
group of people to commit effort to something that I feel will not work.

You obviously do not share my belief, and it is perfectly okay for *you*
to pursue this until something causes your belief to change (if it
does).

However, this type of exploratory investment using the lives of
contributors demands, in my view, an ethical commitment to continuous
testing of beliefs (both yours and mine). Rejecting real-world problems
because they are too inconvenient to understand is a form of willful
ignorance that is inconsistent with this commitment.

In this case, you are being a bit short-sighted. You don't *need* to
understand the details of the law. The summary is sufficient. What the
law says, in essence, is that computational systems used in many medical
applications must avoid disclosing information to unauthorized parties,
and must exercise reasonable standards of diligence in enforcing this.
The definition of "authorized" is contextual, and is determined
primarily by checking who is providing care to which patient, and the
role of that individual. Because there have been serious thefts of
medical information by insiders, the standards of "reasonable diligence"
include defending effectively against deliberate "insider" attempts to
access unauthorized information.
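
To make the "contextual" part concrete, here is a minimal sketch of such a
check (the names and the toy assignment table are purely illustrative, not
taken from any real hospital system or from the law itself): access is
granted only when the requester both stands in a current care relationship
with the patient and holds a role that clinically needs the record.

    /* Hypothetical sketch: authorization as a function of
       (requester, role, patient), not of ownership of the data. */

    #include <stdbool.h>
    #include <stddef.h>
    #include <stdio.h>

    enum role { ROLE_PHYSICIAN, ROLE_NURSE, ROLE_BILLING };

    struct assignment { int staff_id; int patient_id; };

    /* Toy stand-in for the hospital's assignment records. */
    static const struct assignment assignments[] = {
        { 101, 1 },   /* staff 101 currently cares for patient 1 */
        { 102, 2 },   /* staff 102 currently cares for patient 2 */
    };

    /* Is there a current care relationship between staff and patient? */
    static bool is_assigned(int staff_id, int patient_id)
    {
        for (size_t i = 0; i < sizeof assignments / sizeof *assignments; i++)
            if (assignments[i].staff_id == staff_id &&
                assignments[i].patient_id == patient_id)
                return true;
        return false;
    }

    /* Grant read access only to clinical roles with a care relationship. */
    static bool may_read_record(int staff_id, enum role r, int patient_id)
    {
        if (!is_assigned(staff_id, patient_id))
            return false;                       /* no relationship: deny */
        return r == ROLE_PHYSICIAN || r == ROLE_NURSE;
    }

    int main(void)
    {
        printf("%d\n", may_read_record(101, ROLE_PHYSICIAN, 1)); /* 1: allowed */
        printf("%d\n", may_read_record(101, ROLE_BILLING,   1)); /* 0: denied  */
        printf("%d\n", may_read_record(102, ROLE_NURSE,     1)); /* 0: denied  */
        return 0;
    }

A real deployment would of course back is_assigned() with the hospital's
actual assignment database and audit every decision; the only point here is
that the policy is evaluated against the care context at the time of access,
which is exactly what an insider attack tries to step around.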

When the best technology available is UNIX, many unavoidable errors are
acceptable as "reasonably diligent". When a better technology is
available, failing to use it appropriately constitutes lack of adequate
diligence under the law. A system running Hurd-NG (as currently
conceived) could not satisfy the test of reasonable diligence in this
context.



shap




