From: Pádraig Brady
Subject: Re: finding directories with many files in a file system
Date: Fri, 26 Jul 2013 13:49:48 +0100
User-agent: Mozilla/5.0 (X11; Linux x86_64; rv:17.0) Gecko/20130110 Thunderbird/17.0.2
On 07/22/2013 10:43 AM, Bernhard Voelker wrote:
> On 07/19/2013 01:11 AM, Philip Rowlands wrote:
>> This gives the non-cumulative total per directory:
>> $ find . -xdev -printf '%h\n' | sort | uniq -c | sort -n
>>
>> but this doesn't handle hard links. You could use -printf '%h %i\n' and
>> post-process the duplicate inodes (no per-file stat needed since
>> findutils 4.5.4).
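
For example, that post-processing might look like the following
(untested sketch; it assumes GNU find/sort and directory names
without embedded whitespace):

$ find . -xdev -printf '%h %i\n' | sort -k2,2 -u \
    | cut -d' ' -f1 | sort | uniq -c | sort -n

Here sort -k2,2 -u keeps one line per inode, so each set of hard
links is credited to a single directory, much as du counts a
hard-linked file only once.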
>
> Thanks.
> I had a look at src/du.c, and adding such an --inodes option
> was pretty easy; see the attached patch.
>
> Comments?
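
With the patch applied I'd expect usage along these lines (untested;
note that, unlike the find pipeline above, du's counts are cumulative,
i.e. each directory includes its subdirectories):

$ du --inodes -x . | sort -n | tail

A non-cumulative per-directory count, matching the find output, would
presumably fall out of combining it with -S/--separate-dirs:

$ du --inodes -S -x . | sort -n | tail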
>
> P.S. I must admit that in the (unusual) corner case of listing the
> inode usage of the working directory after it has been deleted,
> the result is not zero:
>
> $ d=$(pwd)/d ; mkdir $d; cd $d; rmdir $d
> $ stat -c "%h %n" .
> 0 .
> $ du --ino .
> 1 .
>
> Have a nice day,
> Berny
>
This patch is simple and probably makes sense as a complement to df -i.
BTW, why not also support --threshold? With --inodes, could --threshold
be interpreted as an inode count rather than a byte count?
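
For example (assuming --threshold were indeed made to apply to the
inode count when --inodes is given), finding all directories holding
at least 1000 inodes would then be as simple as:

$ du --inodes --threshold=1000 -x . | sort -n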
thanks,
Pádraig.