bug-findutils

Re: ARG_MAX


From: James Youngman
Subject: Re: ARG_MAX
Date: Wed, 25 Apr 2007 23:23:21 +0100

On 4/25/07, address@hidden <address@hidden> wrote:

> Dear James,
>
>   I just took a look at your comments in buildcmd.c.

Thanks.  I've copied the address@hidden mailing list, since we're
discussing design changes.  I hope you don't mind.

(Background for the benefit of list members: there are operating
systems - including commercial Unix systems - on which the
compile-time value of ARG_MAX is not a reliable guide to how long a
command line we can actually use.  In the absence of a well-defined
answer at compile time, xargs and find will have to cope gracefully
when exec() fails because of argument length limits.  My plan (and
part of Leslie's SOC proposal) is to respond to this by finding a
lower, working value of ARG_MAX when that happens.  The comments
Leslie refers to are in lib/buildcmd.c in the trunk of the CVS
findutils code.)


>   Is it really desirable to intertwine trial & error ARG_MAX detection
> into it?  A separate detection via binary search *before* any actions
> are carried out would be much cleaner IMHO.  It would not be as
> expensive, either, as it would run in logarithmic time, as opposed to
> {linear plus overhead} time.


My main concern is the overhead of figuring out the ARG_MAX value.
Finding the prevailing maximum argument length by binary search,
starting from an assumption of 128K, might take as many as 17 calls
to exec, of which 8 could be successful.  That is a nontrivial amount
of processing.
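
To put a number on that, here is the sort of binary-search probe
being proposed, as I understand it.  This is only a sketch:
try_exec_with_length is a made-up stand-in for whatever would really
build and exec a trial command line, and here it just pretends the
kernel limit is 100000 bytes.

#include <stdbool.h>
#include <stdio.h>
#include <stddef.h>

/* Stand-in for the real probe.  In findutils this would fork and exec
   a trial command of the given combined length; here it just pretends
   the true kernel limit is 100000 bytes.  */
static bool
try_exec_with_length (size_t len)
{
  return len <= 100000;
}

/* Binary-search the largest length that still execs successfully.
   Starting from 128K (2^17 bytes) this performs at most 17 probes,
   roughly half of which succeed.  */
static size_t
probe_arg_max (size_t assumed_limit)
{
  size_t lo = 1, hi = assumed_limit;   /* the limit is at least 1 byte */
  while (lo < hi)
    {
      size_t mid = lo + (hi - lo + 1) / 2;
      if (try_exec_with_length (mid))
        lo = mid;          /* mid worked: real limit is >= mid */
      else
        hi = mid - 1;      /* mid failed: real limit is <  mid */
    }
  return lo;
}

int
main (void)
{
  printf ("%zu\n", probe_arg_max (128 * 1024));   /* prints 100000 */
  return 0;
}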

This overhead is especially undesirable in the common case, where the
total size of the arguments passed to exec (by xargs or find) is in
fact below the limit anyway.  So I'd prefer not to expend the
additional effort unless we have to.  Specifically, it would be good
if we could come up with a design where we spend no up-front effort
when our static, compile-time guess is correct.
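
The lazy approach I have in mind would look roughly like this.  Again
a sketch only: run_command_with_limit is an invented placeholder, not
real findutils code.  The point is that we pay nothing extra unless
exec actually rejects the command line with E2BIG.

#include <errno.h>
#include <stddef.h>

/* Placeholder for the code that builds a command line no longer than
   `limit' bytes and execs it; here it simulates a kernel that only
   accepts 64K.  */
static int
run_command_with_limit (size_t limit)
{
  if (limit <= 64 * 1024)
    return 0;
  errno = E2BIG;
  return -1;
}

static size_t arg_max_in_use = 128 * 1024;   /* compile-time guess */

/* Trust the guess, and only shrink it if exec actually fails
   with E2BIG.  */
static int
run_command (void)
{
  for (;;)
    {
      if (run_command_with_limit (arg_max_in_use) == 0)
        return 0;               /* common case: no extra work at all */
      if (errno != E2BIG)
        return -1;              /* unrelated failure: give up */
      arg_max_in_use /= 2;      /* guess was too big; retry smaller */
      if (arg_max_in_use < 2048)
        return -1;              /* something is badly wrong */
    }
}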

There is, in fact, some ambiguity in the precise interpretation of
the ARG_MAX limit anyway.  The POSIX standard says:

"The number of bytes available for the new process' combined argument
and environment lists is {ARG_MAX}. It is implementation-defined
whether null terminators, pointers, and/or any alignment bytes are
included in this total."

So it's slightly unclear how we count up to ARG_MAX in the first
place.  If pointers are also included in the total, the calculation
of how much of that space we have "used" so far is different.
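
Purely as an illustration, here are the two accountings side by side
(the helper names are invented for this example, nothing more):

#include <stdio.h>
#include <string.h>

/* Scheme A: charge only the characters of the string itself.  */
static size_t
cost_chars_only (const char *arg)
{
  return strlen (arg);
}

/* Scheme B: also charge the terminating NUL and the argv pointer
   slot.  */
static size_t
cost_with_overhead (const char *arg)
{
  return strlen (arg) + 1 + sizeof (char *);
}

int
main (void)
{
  printf ("%zu %zu\n",
          cost_chars_only ("foo"), cost_with_overhead ("foo"));
  return 0;   /* prints "3 12" on a typical 64-bit system */
}

For a three-character argument like "foo", the first scheme charges 3
bytes and the second 3 + 1 + 8 = 12 on a typical 64-bit system, so the
amount of headroom we believe we have can differ quite a bit.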

>   So, are there any compelling reasons for doing so?

To be honest, it mostly comes down to performance.

James.



