Re: security of the emacs package system, elpa, melpa and marmalade

From: Stephen J. Turnbull
Subject: Re: security of the emacs package system, elpa, melpa and marmalade
Date: Sat, 28 Sep 2013 02:31:05 +0900

Matthias Dahl writes:

 > > Well, sure.  A concrete block is inherently more secure against
 > > an earthquake than a building.  That doesn't mean we should
 > > replace the latter with the former.
 > Stephen, I'm not advocating we should all drive around in an
 > armored car or never ever connect our computers with the evil
 > internet or whatever.

It's an exaggeration.  See below.

 > All I am saying is: It would be very helpful if we could give the
 > user a few tools to handle, grasp and maybe harden certain security
 > aspects.  And in this concrete discussion: It is all about plugins
 > that, once they are installed through whatever means, can also do
 > whatever they choose.

Sure.  *Preventing* that is going to require doing something that is
probably impossible for any program that isn't an operating system in
control of the machine.

 > You wouldn't work as root on your system, would you?

I do every day, to run emerge --update. ;-)

 > And why should a plugin get full rights if it just needs a few
 > pieces of information from the local buffer?

It shouldn't.  But that question is not interesting and the answer
isn't controversial.  The interesting question is, "why should a
plugin be denied the rights it needs when those go beyond reading the
buffer it was invoked from?"

 > But the reality is, we have to use software that others created. And the
 > open source/free software world is full of great minds and talents that
 > create astounding pieces of software. And those people, pouring their
 > time and lives into those projects, usually would never place
 > any malicious code into their creations. It is through hacks or other
 > circumstances that such things happen. The world is not inherently
 > evil.

True.  The world is not.  In fact, most of the bad guys aren't evil,
just willing to bend the rules a bit to get their way.  Still, cracked
is cracked, and it only takes once, no matter what the ratio of
great|talented|astounding is to "warped".  And that's why the issue
here is that the answer to the "interesting question" is the same one
that a mother gives to a 5-year-old: "just because".  The way to get
the necessary permission in general is to ask the user each time the
program wants access.

 > I did not know that the Python devs worked on a sandbox, honestly.

They didn't just work on it; they had one (the restricted execution
option) and then they stopped distributing it because it didn't keep
its promises.  There's been work on a better one, but IIRC it hasn't
been PEP'ed yet.  And nobody except the author (who is very good, I
admit) has really tried to "break" it or use it in production.
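To illustrate why restricted execution is so hard to get right, here is a sketch (not the actual hole that killed Python's restricted-execution option, just the classic style of escape) showing that merely stripping the builtins away from evaluated code is not enough: the object graph itself leaks a path back to every loaded class.

```python
# Naive "sandbox": evaluate untrusted code with no builtins at all.
SANDBOX_GLOBALS = {"__builtins__": {}}

# The untrusted payload never names a builtin, yet it walks from an
# empty tuple up to `object` and back down to every subclass loaded
# in the interpreter -- typically including file- and process-related
# classes an attacker can then instantiate.
payload = "().__class__.__base__.__subclasses__()"

classes = eval(payload, SANDBOX_GLOBALS)
print(len(classes) > 0)  # the "sandboxed" code enumerated live classes
```

Plugging one such path just moves the game to the next attribute chain, which is why promise-keeping sandboxes tend to require interpreter-level support rather than namespace filtering.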

 > But the problem here is a bit more "relaxed", imho. We are not talking about
 > hardening / sandboxing a language in general but only a very concrete
 > functionality in a specific program (which, granted, is very tightly
 > intertwined with the language it is written in).

No, it's *not* a concrete functionality.  The concrete functionality
is "Shall the plugin be installed?"  The answer to that is easy to
get: you ask the user once, at install time.
But if you're talking about preventing an untrusted plugin from doing
"evil" things, you need to accompany every call to a sensitive
function with some way to determine whether the function should be
executed or not.  That is a sandbox, but you're welcome to call it by
a different name if you like.

 > [Refusing to install untrusted code] may reduce the risk but is
 > this really a solution?

I believe there is no solution.  Security as we understand it today is
about preventing some entities from accessing the functionality of
certain other entities.  The more security, the less functionality can
be accessed.  You can't get both to 100%.

 > Say you use only the great jedi.el for your Python development. I
 > am sure that its author Takafumi Arakaki would never put anything
 > harmful in it... but I can imagine several scenarios how something
 > harmful could end up in it nevertheless without him noticing it for
 > a while.

Sure.  But the chances are pretty good that I would.  Anyway, the
definition of "absolutely need" is "I'm willing to bet that I or some
other user would catch it even if the author doesn't."

There's another answer based on the details of your example.  I avoid
doing development on exposed hosts.  In one sense that's unfair, but
in another it goes to the heart of the matter.
