
From: Alex
Subject: Re: [bug #31961] Out-of-control memory usage when run against a large directory
Date: Fri, 24 Dec 2010 18:08:29 -0200

Hi James,

The directory has no subdirectories. I have been unable to count the number
of files in the directory because the usual find . | wc -l doesn't work either.
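(As an aside, one way to count entries in a huge directory without buffering the whole listing is to iterate over it lazily. This is a minimal sketch, assuming Python is available on the box; the path is a placeholder, not the actual directory from the report.)

```python
import os

def count_entries(path):
    """Count directory entries one at a time.

    os.scandir() wraps opendir()/readdir() and yields entries lazily,
    so memory use stays constant no matter how large the directory is,
    unlike building a full listing in memory first.
    """
    n = 0
    with os.scandir(path) as it:
        for _ in it:
            n += 1
    return n

# usage: count_entries("/path/to/huge/dir")
```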

I just tried the latest stable release 4.4.2 downloaded from the official
site and compiled from source and ran into the same problem.

Will try oldfind.

I know for a fact that the opendir call takes a long time to get a response
from the system. My PHP script only starts outputting anything after a
couple of minutes. I don't know why find would be using up memory while it
waits for the response.
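(A quick way to check whether the delay really comes from the initial opendir() rather than from reading the entries is to time how long the first entry takes to arrive. This is a hypothetical diagnostic sketch in Python, not part of findutils.)

```python
import os
import time

def time_first_entry(path):
    """Return seconds elapsed until opendir() plus the first readdir()
    yields a directory entry (or None for an empty directory).

    A large value here points at the kernel/filesystem stalling on the
    open or the first read, rather than at the caller's own buffering.
    """
    start = time.monotonic()
    with os.scandir(path) as it:
        next(it, None)  # force the first entry to be fetched
    return time.monotonic() - start
```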

- Alexandre

On Thu, Dec 23, 2010 at 6:46 PM, James Youngman <address@hidden> wrote:

> Follow-up Comment #1, bug #31961 (project findutils):
> The sizes of the files (as opposed to directories) are of course
> irrelevant; find never opens them, let alone reads them.
> What is the depth of the directory hierarchy under this directory? How
> many entries are there in each? Do you get the same characteristics if you
> use the "oldfind" binary that's also generated when you build find (from
> source)?
> Also, findutils-4.4.0 is quite old now, since it was released on
> 2008-03-15; could you try 4.4.2 (from ftp.gnu.org) or 4.5.9 (from
> alpha.gnu.org)?
>    _______________________________________________________
> Reply to this item at:
>  <http://savannah.gnu.org/bugs/?31961>
> _______________________________________________
>  Message sent via/by Savannah
>  http://savannah.gnu.org/
