
Re: du --files-from feature request

From: Aaron Peterson
Subject: Re: du --files-from feature request
Date: Sat, 6 Dec 2008 14:05:18 -0800

I had no idea that this specifically was a security risk.

I do remember one of my friends explaining how other characters could
be used as separators, and that if I have files with dashes in their
names and use a * that the shell expands, a program can try to
interpret the expanded filename as an option.
It looks like the documentation has been made more n00b-friendly,
which should help readers of the docs very much.

I am convinced that your choice not to do  --files-newline-from  is
the right one, considering the environment.

Padraig had this solution to one of the problems I found on the Debian site:

>     # wc could not be used to count NUL-separated items
>     # (another more complex program would be needed)

tr -d '\n' | tr '\0' '\n' | wc -l
which works, but really:
How many system admins are going to know that \n's could be part of a
filename, or care if they are just putting together a quick script?
(One that always seems to end up going into production because of
pointy-haired bosses.)  If they had no prior knowledge and just do the
tr from \n to \0 for du, they still get the security vulnerability:
the attacker puts \n characters in the name of the really big file,
the du command doesn't count it, and the attacker manages to fill the
disk, triggering some sort of unhandled exception.
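To make the counting problem concrete, here is a throwaway sketch (the file and directory names are invented for the demo) showing how a newline in a filename inflates a naive `wc -l` count, while the corrected NUL-based pipeline above gets it right:

```shell
# Scratch directory with one ordinary file and one whose name
# contains an embedded newline.
dir=$(mktemp -d)
touch "$dir/normal" "$dir/evil
name"

# Naive newline count: the embedded newline looks like a separator,
# so two files are counted as three.
naive=$(( $(find "$dir" -type f | wc -l) ))

# Corrected pipeline: strip real newlines, turn the NUL separators
# into newlines, then count lines.
fixed=$(( $(find "$dir" -type f -print0 \
    | tr -d '\n' | tr '\0' '\n' | wc -l) ))

# Even simpler: keep only the NUL bytes and count them directly.
simple=$(( $(find "$dir" -type f -print0 | tr -dc '\0' | wc -c) ))

echo "naive=$naive fixed=$fixed simple=$simple"
rm -rf "$dir"
```

Both NUL-based variants report 2 files; the naive count reports 3.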

Is there already a standard that is more end-user friendly than POSIX
for general usage, one where "advanced" knowledge isn't required to
maintain security?

For example, I can live without newline being an allowed filename
character (though I don't actually know how lowering the number of
allowed bytes would work with multibyte characters),

and I also tend to find case sensitivity annoying.


Ultimately, I find it annoying that more options are needed than this:
   find "*want*" | du
For the utilities to work properly they would need to know their
context, and I believe everybody agrees that it's not the utilities'
job to get that complicated.

I'm wondering how hard it would be to have a new shell that would
use context to interpret the command like this:
find -iname "*want*" -type f -print0 | du --files0-from -
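The explicit pipeline above already works with today's GNU tools, if verbosely. A throwaway sketch (scratch names and sizes invented for the demo), using `du -cb` for apparent byte sizes so the total is deterministic, showing that a file with a newline in its name is still measured:

```shell
# Scratch directory with two matching files, one with an embedded
# newline in its name; each holds 4 bytes of data.
dir=$(mktemp -d)
printf 'data' > "$dir/plain-want"
printf 'data' > "$dir/evil
want"

# Each match travels as one NUL-terminated record, so du measures
# both files; -c appends a grand-total line, -b counts apparent bytes.
total=$(find "$dir" -iname '*want*' -type f -print0 \
    | du -cb --files0-from=- | tail -n 1 | cut -f1)

echo "total=$total"
rm -rf "$dir"
```

The grand total is 8 bytes, covering both files; a newline-delimited list would have mangled the second name.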

There are certain "default" behaviors that could reasonably be
assumed, and when two things are put together, certain interfaces
apply predictably enough to be calculated automatically.  Basically,
if I can ask on a forum and people can tell me why my choice was
insecure, or wouldn't work with certain filenames, or needs to be
formatted a certain way, then the shell could do that checking before
it executes the command.

I'm actually a bit excited by this idea.
I'm guessing that reusing/misusing existing well-defined utility
names in this shell would be a Very Bad Thing (TM),
so I'd have to come up with new commands that get interpreted as the
existing, very well-thought-out utilities.

People are often not even using disks, so "disk usage" isn't as
general a name as it could have been.
Media usage (mu) could be an example of a command in this shell that
would expand to du with appropriate options;
find could be replaced with ff.

On Sat, Dec 6, 2008 at 7:06 AM, Eric Blake <address@hidden> wrote:
> According to Aaron Peterson on 12/6/2008 2:16 AM:
>> Wow, You guys seem to really care.
>>  6.10 is the version of du  that I'm using. (Ubuntu Ibex)
> 6.12 is the latest stable version, and 7.0 is also available for testing.
>> So, I understand that newline/CR is a complicated way to delimit
>> lists because of system variations, but did somebody tackle
>> this by determining what the (possibly multibyte) newline sequence is
>> and make a --from-files work?
> We tend not to worry too much about CR as line terminators, because that
> is not the POSIX way.  The coreutils are not consistent on how CR is
> handled.  On the other hand, we are not averse to easy-to-maintain
> patches that make life more portable on platforms with CR problems (for
> example, I've had a low-priority item on my todo list to escape CR as \r
> in md5sum output).
> But back to your question about recognizing newline-separated lists: using
> newline to delimit file lists is inherently insecure, because newline is a
> valid filename character.  Someone can intentionally name a file
> $'/tmp/oops\n/bin', and if a careless sysadmin does
> 'ls /tmp | xargs rm -rf', then they just nuked /bin.  We don't have any
> plans on adding --from-files that takes newline separated entries, because
> there is no point adding security holes.  The only valid option is
> --from-files0 (although, the way getopt works, you can use unambiguous
> abbreviations, so --from-files is shorthand for --from-files0, meaning
> that you still plan on using NUL terminators rather than newline terminators).
> --
> Don't work too hard, make some time for fun as well!
> Eric Blake             address@hidden
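For completeness, the NUL-safe counterpart of the dangerous 'ls /tmp | xargs rm -rf' pipeline Eric warns about above can be demonstrated with a throwaway sketch (scratch names invented for the demo):

```shell
# Scratch directory containing an entry with an embedded newline in
# its name, mimicking the $'oops\n/bin'-style attack.
dir=$(mktemp -d)
mkdir "$dir/oops
victim"
touch "$dir/ordinary"

# NUL-terminated names make the embedded newline plain data, so
# exactly the intended entries are removed and nothing outside the
# directory is ever touched.
find "$dir" -mindepth 1 -maxdepth 1 -print0 | xargs -0 rm -rf

remaining=$(( $(find "$dir" -mindepth 1 | wc -l) ))
echo "remaining=$remaining"
rmdir "$dir"
```

Afterwards the scratch directory is empty; a line-based xargs would instead have tried to remove the bogus names "oops" and "victim".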
