Argument list too long for ls
Solution 1
The error message "Argument list too long" comes from the shell expanding the * in ls *.txt.
That limit is a safety measure for both binary programs and your kernel. See "ARG_MAX, maximum length of arguments for a new process" for more information about it, and for how it's used and computed.
There is no such limit on pipe size. So you can simply issue this command:
find -type f -name '*.txt' | wc -l
NB: on modern Linux, weird characters in file names (such as newlines) are escaped by tools like ls or find when printing, but are still passed through as-is by the * expansion. If you are on an old Unix, you'll need this command:
find -type f -name '*.txt' -exec echo \; | wc -l
NB2: I was wondering how one can create a file with a newline in its name. It's not that hard, once you know the trick:
touch "hello
world"
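To see why those newlines matter for counting, here is a small sketch (the directory and file names are made up; it assumes a find that supports -print0, which GNU, BSD, and busybox all do): a naive line count comes out one too high, while counting NUL separators stays correct.

```shell
# Sketch: how a newline in a file name breaks line counting.
dir=$(mktemp -d)
touch "$dir/a.txt" "$dir/b.txt"
touch "$dir/hello
world.txt"                                  # name contains a newline

# Naive count: 3 files, but the embedded newline produces 4 lines.
find "$dir" -type f -name '*.txt' | wc -l

# Robust count: emit one NUL per file and count the NUL bytes instead.
find "$dir" -type f -name '*.txt' -print0 | tr -dc '\0' | wc -c

rm -rf "$dir"
```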
Solution 2
It depends mainly on your version of the Linux kernel.
You should be able to see the limit for your system by running
getconf ARG_MAX
which tells you the maximum number of bytes a command line can have after being expanded by the shell.
In Linux < 2.6.23, the limit is usually 128 KB.
In Linux >= 2.6.25, the limit is either 128 KB or 1/4 of your stack size (see ulimit -s), whichever is larger.
See the execve(2) man page for all the details.
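You can inspect these numbers on your own machine; a quick sketch (Linux; the exact values vary per system and shell):

```shell
# Sketch: inspect the limits described above.
getconf ARG_MAX      # max bytes for the arguments plus environment of a new process
ulimit -s            # stack size soft limit, in KB
# On Linux >= 2.6.25 the effective limit is roughly max(128 KB, stack/4);
# the quarter-of-stack figure in bytes (does not apply if ulimit -s
# prints "unlimited"):
echo $(( $(ulimit -s) * 1024 / 4 ))
```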
Unfortunately, piping ls *.txt isn't going to fix the problem, because the limit is in the operating system, not the shell.
The shell expands *.txt, then tries to call
exec("ls", "a.txt", "b.txt", ...)
and you have so many files matching *.txt that you're exceeding the 128 KB limit.
You'll have to do something like
find . -maxdepth 1 -name "*.txt" | wc -l
instead.
(And see Shawn J. Goff's comments below about file names that contain newlines.)
Solution 3
Another workaround:
ls | grep -c '\.txt$'
Even though ls produces more output than ls *.txt produces (or attempts to produce), it doesn't run into the "Argument list too long" problem, because you're not passing any arguments to ls. Note that grep takes a regular expression rather than a file matching pattern.
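A tiny illustration of that regex-versus-glob point (the directory and file names here are made up): the anchored regex '\.txt$' matches what the glob *.txt would, while an unanchored pattern can over-match.

```shell
# Sketch: grep -c matches a regular expression, not a shell glob.
dir=$(mktemp -d); cd "$dir"
touch a.txt b.txt notes.txt.bak

ls | grep -c '\.txt$'   # anchored: matches a.txt and b.txt only -> 2
ls | grep -c 'txt'      # unanchored: also matches notes.txt.bak -> 3
```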
You might want to use:
ls -U | grep -c '\.txt$'
(assuming your version of ls supports this option). This tells ls not to sort its output, which can save both time and memory; and in this case the order doesn't matter, since you're just counting files. The resources spent sorting the output are usually not significant, but in this case we already know you have a very large number of *.txt files.
And you should consider reorganizing your files so you don't have so many in a single directory. This may or may not be feasible.
Solution 4
This might be dirty, but it works for my needs and is within my competency. It doesn't perform very quickly, but it allowed me to get on with my day.
ls | grep jpg | <something>
I was getting a list of 90,000 jpgs and piping them to avconv to generate a timelapse.
I was previously using ls *.jpg | avconv before I ran into this issue.
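For the record, a more robust way to build that list is to let find emit it NUL-delimited and hand it to the consumer through xargs. The sketch below is an assumption on my part (GNU find/sort/xargs, and ls -l as a stand-in consumer), not the author's exact avconv invocation:

```shell
# Sketch: list the jpgs without ever expanding *.jpg on a command line,
# NUL-delimited so odd file names survive the pipeline.
find . -maxdepth 1 -name '*.jpg' -print0 |
    sort -z |             # keep frames in name order (GNU sort)
    xargs -0 ls -l        # stand-in for the real consumer (e.g. avconv)
```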
Solution 5
MAX_ARG_PAGES appears to be a kernel parameter. Using find and xargs is a typical combination to address this limit, but I'm not sure it'll work for wc.
Piping the output of find . -name \*\.txt to a file and counting the lines in that file should serve as a workaround.
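That workaround might look like the following sketch (the mktemp-based temp file is my addition; it still miscounts names that contain newlines, as noted in Solution 1):

```shell
# Sketch: stage the file list in a temporary file, then count its lines.
tmp=$(mktemp)
find . -name '*.txt' > "$tmp"
wc -l < "$tmp"
rm -f "$tmp"
```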
Updated on September 18, 2022

Comments
- zahypeti almost 2 years: I get the following error when trying to run ls *.txt | wc -l in a directory that contains many files: -bash: /bin/ls: Argument list too long. Does the threshold of this "argument list" depend on the distro or the computer's spec? Usually I'd pipe the result of such a big listing to some other command (wc -l, for example), so I'm not concerned with the limits of the terminal.
- Admin about 12 years: That counts as parsing ls's output, which is a bad idea, so better avoid it. For counting, see "What's the best way to count the number of files in a directory?"; for a tricky workaround, see "Why doesn't a for loop raise the 'argument too long' error?".
- Admin about 12 years: @manatwork Yes, I saw those questions too. Just wondering about a better, more general way to use or redirect a long output from a command.
- Admin over 8 years: You can use getconf ARG_MAX to get the limit on most Unix-based systems.
- Admin over 4 years: To count the files that match a pattern, use set -- *.txt; printf 'There are %s .txt files\n' "$#" instead. This gives you the correct count regardless of whether a name contains newlines.
- manatwork about 12 years: Whatever you do with ls's output will not solve this: as long as the *.txt wildcard expands over the limit, the shell will fail before even starting ls, so it never generates any output.
- Bram about 12 years: True, I've updated my answer.
- manatwork about 12 years: Better. But to make it a replacement for ls you should specify -maxdepth 1 to avoid recursively scanning the subdirectories.
- Coren about 12 years: @ShawnJ.Goff I have tested it. There's no need for echo in the current version of GNU find.
- Shawn J. Goff about 12 years: @Coren @Mikel Not everybody has GNU's find. The find on OS X and on busybox-based systems (and, I would guess, on any BSD-based system) shows the file name with the newline in it, which would mess up the count.
- Mikel about 12 years: Huh? wc -l is counting newlines. So we want it to have newlines.
- Mikel about 12 years: Oh, I see your point. You don't want to print the file name at all, in case the file name itself contains a newline. Fair enough.
- Heath Borders about 12 years: Sorry for not being able to upvote an answer. Need more reputation. :(
- Jacob Lindeen about 12 years: Sorry for not being able to upvote an answer. Need more reputation. :( Thank you all!!
- Guilherme Salomé about 7 years: Could you explain what the . and the -maxdepth 1 mean in the last line? Thanks! :D
- Mikel about 7 years: @GuilhermeSalomé The . means the current directory, and -maxdepth 1 means it doesn't look in subdirectories. This was intended to match the same files as *.txt.
- blak3r over 5 years: Mac OS X doesn't like -type; this works though: find . -name '*.txt' | wc -l
- roaima over 4 years: @blak3r but mkdir my_favourites.txt