What's the best way to count the number of files in a directory?


Solution 1

How about this trick?

find . -maxdepth 1 -exec echo \; | wc -l

As portable as find and wc.
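
As pointed out in the comments, this also counts the . entry for the directory itself, so it reports one more than the number of files; excluding it keeps the same approach:

find . -maxdepth 1 ! -name . -exec echo \; | wc -l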

Solution 2

With bash, without external utilities, nor loops:

shopt -s dotglob
files=(*)
echo ${#files[@]}

In ksh, replace shopt -s dotglob by FIGNORE=.?(.). In zsh, replace it by setopt glob_dots, or remove the shopt call and use files=(*(D)). (Or just drop the line if you don't want to include dot files.) Portably, if you don't care about dot files:

set -- *
echo $#

If you do want to include dot files:

set -- *
if [ -e "$1" ]; then c=$#; else c=0; fi
set .[!.]*
if [ -e "$1" ]; then c=$((c+$#)); fi
set ..?*
if [ -e "$1" ]; then c=$((c+$#)); fi
echo $c
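
One caveat picked up in the comments: with the bash array approach above, an empty directory yields a count of 1, because the unexpanded * pattern itself becomes the single array element. In bash, enabling nullglob avoids that:

shopt -s nullglob dotglob
files=(*)
echo ${#files[@]}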

Solution 3

find . ! -name . -prune -print | grep -c /

Should be fairly portable to post-80s systems.

That counts all the directory entries except . and .. in the current directory.
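
As noted in the comments, for a directory other than the current one it is safer to anchor on dir/. than to match on the directory's own name (which breaks if a subdirectory happens to share that name); something along these lines should work:

find /some/dir/. ! -name . -prune -print | grep -c /

(Here /some/dir is just a placeholder for the directory you want to count.)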

To count files in subdirectories as well:

find .//. ! -name . | grep -c //

(that one should be portable even to Unix V6 (1975), since it doesn't need -prune)

Solution 4

Try:

ls -b1A | wc -l

The -b option escapes non-printable characters, -A shows all entries except . and .., and -1 lists one name per line (the default when the output goes to a pipe, but it doesn't hurt to be explicit).
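
To see why -b matters, here is a quick check in an otherwise empty test directory (GNU ls shown; the exact escaping may differ on other systems):

$ touch 'foo
bar'
$ ls -b1A
foo\nbar
$ ls -b1A | wc -l
1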

As long as we're including higher-level scripting languages, here's a one-liner in Python:

python -c 'import os; print(len(os.listdir(os.curdir)))'

Or with full 'find':

python -c 'import os; print(len([j for i in os.walk(os.curdir) for j in i[1]+i[2]]))'
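
If, as several comments suggest, you only care about regular files in the current directory, a Python 3 spelling could be (non-recursive, following symlinks):

python3 -c 'import os; print(sum(1 for e in os.scandir(".") if e.is_file()))'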

Solution 5

Have you considered perl, which should be relatively portable?

Something like:

use File::Find;

# directories to count files in; "." means the current directory
my @directories_to_search = (".");
my $counter = 0;

sub wanted {
    -f && ++$counter;   # count regular files only
}

find(\&wanted, @directories_to_search);
print "$counter\n";
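
If you just want a quick count from the shell, the same idea can be written as a one-liner (counting regular files under the current directory, for instance):

perl -MFile::Find -le '$c = 0; find(sub { -f && $c++ }, "."); print $c'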

Comments

  • rahmu
    rahmu over 1 year

    If parsing the output of ls is dangerous because it can break on some funky characters (spaces, \n, ... ), what's the best way to know the number of files in a directory?

    I usually rely on find to avoid this parsing, but similarly, find mydir | wc -l will break for the same reasons.

    I'm working on Solaris right now, but I'm looking for an answer as portable across different unices and different shells as possible.

    • Admin
      Admin over 9 years
      I'm not sure it's a duplicate, am I missing something?
    • Admin
      Admin almost 9 years
      This might be a duplicate, but not of the question indicated. find will get you the number of files recursively (use -maxdepth 1 if you don't want that). find mydir -maxdepth 1 -type f -printf \\n | wc -l should handle the special characters in the filename, as they are never printed in the first place.
  • Lekensteyn
    Lekensteyn over 12 years
    This won't work for hidden files either unless the shell is configured to expand those with *.
  • clerksx
    clerksx over 12 years
    This doesn't work (it displays n+1 files on my Debian system). It also doesn't filter for regular files.
  • Nikhil Mulley
    Nikhil Mulley over 12 years
    gnu find . -maxdepth 1 -type f | wc -l
  • rahmu
    rahmu over 12 years
    I like this trick, very clever; but I'm surprised there's no simple straightforward way to do that!
  • enzotib
    enzotib over 12 years
    @Rush: this command should never raise "arg list too long". That only happens with external commands (so never with for).
  • Arcege
    Arcege over 12 years
    @ChrisDown the OP doesn't specify filtering for regular files, asks for number of files in a directory. To get rid of the n+1 issue, use find . -maxdepth 1 ! -name . -exec echo \; | wc -l; some older versions of find do not have -not.
  • Stéphane Chazelas
    Stéphane Chazelas almost 9 years
    Note that -maxdepth is not standard (a GNU extension now also supported by a few other implementations).
  • nisetama
    nisetama almost 8 years
    The first example prints 1 for an empty directory when nullglob is not enabled. In zsh, a=(*(DN));echo ${#a} with the N (nullglob) qualifier does not result in an error for an empty directory.
  • rahmu
    rahmu over 6 years
    You might run into problems if the file name contains a \n or other funky chars (yeah, certain unices allow this).
  • Josh Yang
    Josh Yang over 6 years
    I tried this explicitly before posting my answer and had no problems with it. I used the Nautilus file manager to rename a file so that its name contained a \n in order to try this.
  • Josh Yang
    Josh Yang over 6 years
    You're right, it doesn't work like that. I don't know what I did when I tested this first. Tried again and updated my answer.
  • xhienne
    xhienne over 6 years
    No, the command is OK, but there is already a similar solution and hidden files are not counted.
  • xhienne
    xhienne over 6 years
    One of the rare portable answers on this page, if not the only one.
  • Anthony Geoghegan
    Anthony Geoghegan over 5 years
    I upvoted this answer yesterday as I found it also works well for directories other than the current directory (find dirname ! -name dirname -prune -print). I have since been wondering if there's any particular reason to use grep -c / instead of wc -l (which is probably more commonly used for counting).
  • Stéphane Chazelas
    Stéphane Chazelas over 5 years
    find dirname ! -name dirname doesn't work if there are other directories within that are named dirname. It's better to use find dirname/. ! -name . instead. wc -l counts the number of lines; file names can be made of several lines, as the newline character is as valid as any other in a file name.