coreutils

Re: Faster ls when there are thousands of files in a directory


From: Peng Yu
Subject: Re: Faster ls when there are thousands of files in a directory
Date: Sat, 25 Jun 2011 07:32:31 -0500

On Sat, Jun 25, 2011 at 12:54 AM, Jim Meyering <address@hidden> wrote:
> Peng Yu wrote:
>> When there are a few thousand files/directories in a directory
>> that I want to ls, I see a long wait (a few seconds on a Mac).
>> I'm wondering whether some kind of cache could be built for ls to
>> speed it up? Note my ls is installed from MacPorts (not the native
>> Mac ls).
>
> Use "ls -1U" (efficient with coreutils-7.0 or newer) or find.

If I use -1U with -ltr, I see the results are still sorted. What does
ls do internally with "-1U" for speedup?
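[For reference: with GNU coreutils, `-U` tells ls to emit entries in directory order instead of sorting them, which skips the sorting pass entirely; the observation above that `-ltr` still produces sorted output is consistent with `-t` re-enabling a time sort. A minimal sketch of what `-1U` changes, using an invented scratch directory; only the order of the output can differ, not the set of entries:]

```shell
# Sketch: -1 prints one entry per line; -U lists in directory order
# instead of sorting by name. The entries are the same either way;
# only the order can differ. Directory and file names are invented.
tmp=$(mktemp -d)
for i in 1 2 3 4 5; do touch "$tmp/file$i"; done
sorted=$(ls -1 "$tmp")      # default: sorted by name
unsorted=$(ls -1U "$tmp")   # directory order (often faster on big dirs)
result=fail
[ "$(printf '%s\n' "$unsorted" | sort)" = "$sorted" ] && result=same-entries
echo "$result"
rm -rf "$tmp"
```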

> Someday GNU ls will use fts, and then it will benefit from
> the inode-sorting fts does for some FS types when there are
> very many files.  Then it will be faster with additional
> combinations of options.  But even then, it won't beat "ls -1U",
> which doesn't call stat at all for FS with useful dirent.d_type.
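[An illustrative way to see the "doesn't call stat at all" point, sketched for Linux since strace is not available on a Mac; the `%stat` syscall class assumes a reasonably recent strace, and the file names are invented:]

```shell
# Sketch (Linux, recent strace assumed): compare stat-family syscall
# counts for "ls -l" vs "ls -1U". With usable dirent.d_type, "ls -1U"
# can print names without stat()ing each entry; "ls -l" cannot, since
# it must read per-file metadata. Scratch directory is invented.
tmp=$(mktemp -d)
for i in $(seq 1 200); do touch "$tmp/f$i"; done
if command -v strace >/dev/null 2>&1; then
    strace -c -e trace=%stat ls -l  "$tmp" >/dev/null  # ~one stat per file
    strace -c -e trace=%stat ls -1U "$tmp" >/dev/null  # few or none
fi
rm -rf "$tmp"
demo_ran=yes
echo "$demo_ran"
```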


What is fts?

-- 
Regards,
Peng


