
Re: Faster ls when there are thousands of files in a directory


From: Jim Meyering
Subject: Re: Faster ls when there are thousands of files in a directory
Date: Sat, 25 Jun 2011 07:54:51 +0200

Peng Yu wrote:
> When there are a few thousand files/directories in a directory
> that I want to ls, I experience a long wait time (a few seconds on a Mac).
> I'm wondering if some kind of cache can be built for ls to speed it
> up? Note that my ls is installed from MacPorts (not the native Mac ls).

Use "ls -1U" (efficient with coreutils-7.0 or newer) or find.

Someday GNU ls will use fts, and then it will benefit from
the inode sorting fts does for some FS types when there are
very many files.  At that point it will be faster for additional
combinations of options.  But even then, it won't beat "ls -1U",
which doesn't call stat at all on file systems with useful dirent.d_type.
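To illustrate the d_type point, here is a minimal sketch (not the coreutils
source) of how an unsorted listing can classify entries straight from
readdir() and only fall back to a per-entry stat-family call when the file
system leaves d_type as DT_UNKNOWN:

    /* dtype-ls.c: list a directory, marking directories with a trailing "/",
       without calling stat() when dirent.d_type is available.  */
    #include <dirent.h>
    #include <fcntl.h>
    #include <stdio.h>
    #include <stdlib.h>
    #include <sys/stat.h>

    int
    main (int argc, char **argv)
    {
      const char *dir = argc > 1 ? argv[1] : ".";
      DIR *dp = opendir (dir);
      if (!dp)
        {
          perror (dir);
          return EXIT_FAILURE;
        }

      struct dirent *e;
      while ((e = readdir (dp)))
        {
          int is_dir;
          if (e->d_type != DT_UNKNOWN)
            /* Cheap path: readdir already reported the entry type.  */
            is_dir = (e->d_type == DT_DIR);
          else
            {
              /* Fallback for file systems that don't fill in d_type:
                 this is the expensive per-entry call that gets avoided
                 above.  */
              struct stat st;
              is_dir = (fstatat (dirfd (dp), e->d_name, &st,
                                 AT_SYMLINK_NOFOLLOW) == 0
                        && S_ISDIR (st.st_mode));
            }
          printf ("%s%s\n", e->d_name, is_dir ? "/" : "");
        }

      closedir (dp);
      return EXIT_SUCCESS;
    }

With thousands of entries, the difference between one readdir pass and one
stat per entry is exactly the wait the original poster is seeing.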


