bug-gnu-utils

Re: Awk Error - Help Sought Urgently


From: Bob Proulx
Subject: Re: Awk Error - Help Sought Urgently
Date: Wed, 25 Aug 2004 10:05:32 -0600
User-agent: Mutt/1.3.28i

kahrs wrote:
> Rahul Gopal Joshi wrote:
> > about 5000 files, one after another, with each file containing about 200000
> > records. When I run the awk code for batches of about 100-150 files, it
> > runs fine. But when I try to run it on the entire list of 5000 files, it
> > breaks in between and gives the following error.
> 
> It is possible that this is not a gawk problem.
> Having 5000 files you will face completely new kinds of problems
> with your shell. Try this one:
> 
>  ls * > /dev/null
> 
> Looks like an innocent and trivial command, but the command
> will fail with an error message on many platforms if you have
> thousands of files. Why? It is the shell that tries to expand
> the * and fails because the list is too long. This is not an
> academic objection. Three years ago, I saw someone going mad because
> he tried to process 11000 files (containing some genome sequences) this way.

The 'xargs' program was designed specifically to avoid this problem.
So, as a hint: if the ARG_MAX limitation is what you are hitting,
you can avoid it this way.
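One quick way to check the limit in question (a sketch; `getconf ARG_MAX`
is specified by POSIX, though the value it reports is system-dependent):

```shell
# Print the system's limit on the combined size of command-line
# arguments plus environment, in bytes. If the expanded file list
# approaches this, the shell's exec of awk will fail.
getconf ARG_MAX
```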

Instead of:

  awk 'some awk program' *

Try this instead:

  find . -type f -print0 | xargs -r0 awk 'some awk program'
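Two caveats worth knowing: find recurses into subdirectories, unlike the
original `*`, and xargs may split a long file list across several awk
invocations, so state awk accumulates across files (NR, totals printed
from an END block) resets between batches. A small sketch of the batching
behaviour (the file names here are hypothetical; -n 2 forces tiny batches
just to make the splitting visible):

```shell
# Without -n, xargs starts a new invocation only when adding another
# argument would exceed the system limit; -n 2 caps each batch at two
# arguments so the split is easy to see.
printf '%s\0' a.txt b.txt c.txt d.txt | xargs -0 -n 2 echo awk
# prints: "awk a.txt b.txt" then "awk c.txt d.txt"
```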

Bob



