Re: finding directories with many files in a file system

From: Bernhard Voelker
Subject: Re: finding directories with many files in a file system
Date: Fri, 19 Jul 2013 00:09:41 +0200
User-agent: Mozilla/5.0 (X11; Linux x86_64; rv:17.0) Gecko/20130329 Thunderbird/17.0.5

On 07/18/2013 05:55 PM, Joseph D. Wagner wrote:
> On 07/18/2013 2:25 am, Bernhard Voelker wrote:
>> I have e.g. a file system where most of the inodes space is used,
>> let's say 450k of 500k. What command(s) would I use to find out
>> which sub-directories are eating most of the inodes?

> It is sometimes hard to find a generic solution to a filesystem-specific
> problem.

I don't think it's *that* file-system specific, because many UNIX
file systems have a fixed limit on the maximum number of inodes,
e.g. ext[2-4].
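For reference, inode usage on such a file system can be inspected with
df(1); the -i option switches its output from block counts to inode
counts, which is how a situation like "450k of 500k inodes used" would
show up:

```shell
# Show inode totals, usage, and free counts (instead of block counts)
# for the file system containing the current directory.
df -i .
```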

In the meantime, I wrote a little script which determines the
mount point of the given file/directory argument (or "." if missing)
and, for every sub-directory on the same device, counts the number
of files in the tree below it.  It's far from perfect, and since it
runs find(1) once per sub-directory, its performance is quite
suboptimal, of course.
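The attached script isn't reproduced here, but the approach described
above can be sketched roughly as follows (this is an illustration, not
the attachment itself; `stat -c %m` to get the mount point is a GNU
coreutils feature, and -xdev keeps find(1) from crossing into other
file systems):

```shell
#!/bin/sh
# Rough sketch: rank the mount point's sub-directories by the number
# of entries (files, directories, etc.) in the tree below each one.
target="${1:-.}"

# GNU stat: %m prints the mount point of the file system holding $target.
mnt=$(stat -c %m -- "$target")

for dir in "$mnt"/*/; do
  # Count entries below $dir; -xdev stays on the same file system.
  n=$(find "$dir" -xdev | wc -l)
  printf '%8d %s\n' "$n" "$dir"
done | sort -rn
```

Running find(1) separately for each sub-directory is exactly the
performance problem mentioned above; a single find(1) pass over the
whole mount point, post-processed with awk or sort|uniq -c, would
traverse each inode only once.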

Have a nice day,

[Attachment: application/xz]
