Deleting many files results in "argument list too long"
Solution 1
A typical way to work around the "argument list too long" error is the find command, which hands each matching name to its -delete action internally instead of building one huge argument list:

find . -maxdepth 1 -mindepth 1 -type f -name "*.jpg" -delete
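Since this deletes files irreversibly, it is worth previewing the matches first. A minimal sketch (the scratch directory and filenames here are made up for illustration) that lists before deleting:

```shell
# Illustrative setup in a throwaway directory (filenames are hypothetical).
dir=$(mktemp -d)
cd "$dir" || exit 1
touch photo1.jpg photo2.jpg keep.mp4

# Preview: -print lists what would match, without deleting anything.
find . -maxdepth 1 -mindepth 1 -type f -name "*.jpg" -print

# Once the list looks right, replace -print with -delete.
find . -maxdepth 1 -mindepth 1 -type f -name "*.jpg" -delete
```

Because find enumerates the directory itself, the number of matching files never affects the length of any single command line.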
Solution 2
You can use xargs:

printf '%s\0' *.jpg | xargs -0 rm --

In bash, the printf command is a built-in and is not subject to the same argument-length limitations.
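A sketch of why this works (scratch directory and filenames are made up): the glob is expanded by the shell itself and handed to the builtin printf, so no exec boundary is crossed at that point; xargs then splits the NUL-separated list into as many rm invocations as fit under the limit.

```shell
# Illustrative setup in a throwaway directory (filenames are hypothetical).
dir=$(mktemp -d)
cd "$dir" || exit 1
touch a.jpg b.jpg c.jpg video.mp4

# printf is a shell builtin: the expanded glob never crosses an exec
# boundary here, so ARG_MAX does not apply to it. xargs -0 reads the
# NUL-separated names and runs rm in appropriately sized batches.
printf '%s\0' *.jpg | xargs -0 rm --

ls   # only video.mp4 is left
```

The NUL separator (-0) matters: it is the only byte that cannot appear in a filename, so names containing spaces or newlines are passed through safely.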
Updated on September 18, 2022

Comments
-
j0h over 1 year
I have 280 thousand photos to delete in a folder, but some videos to keep. In the folder, I gave the command # rm *.jpg, but I get "Argument list too long". When I narrow the pattern to a smaller set of the photos, it works, like this: # rm 104-*.jpg. How can I efficiently delete all the JPEG files in a directory without getting the message "Argument list too long"? # rm -f *.jpg gives the same message. Opening the folder in Caja uses too much memory and crashes. I am using Ubuntu MATE.
-
Sergiy Kolodyazhnyy almost 6 years Alternatively, use ./*.jpg instead of relying on --. Makes it more portable.
-
j0h almost 6 years This is a slow way to do what I already did, prior to steeldriver's answer. N-*.jpg extends from 0 to 198 or something like that. I thought about just writing a for loop to increment N in rm *N-N-N.jpg, but thought there had to be a single efficient command for such an operation.
-
Sergiy Kolodyazhnyy almost 6 years @j0h As far as the 104-*.jpg part goes, I apologize for that; I thought that was a pattern for all the images. But this is a single command, which doesn't call anything external. Compared to steeldriver's command - two commands which are also slowed down by a pipe (which implies buffering) - I don't see how this single command is slow.
-
j0h almost 6 years Ah, it might not be, as a single command - just the pattern match. I'm tempted to let the same issue happen again just to test the solutions.
-
Sergiy Kolodyazhnyy almost 6 years @j0h I guess we're on the same topic then :) I'm working on an answer that should generate enough arguments to get below or above the threshold and cause the "Argument list too long" error intentionally. I might post a link once I figure this out.
-
j0h almost 6 years I'm thinking that around 10,000 items is the area that generates that error.
-
Sergiy Kolodyazhnyy almost 6 years@j0h Actually, it's not related to number of items at all. It's the total size in bytes of the names that get passed to the command.
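This can be checked directly: the kernel limit, ARG_MAX, is a byte budget for the combined argument and environment strings of a single exec call, not a count of files. A small sketch (the exact numbers vary by system):

```shell
# ARG_MAX is the byte budget for one exec's argv plus environment combined.
getconf ARG_MAX

# The environment eats into the same budget, so the usable space for
# filenames is roughly ARG_MAX minus the environment's size in bytes.
env | wc -c
```

This is why a rough file-count threshold cannot be stated: 10,000 short names may fit where 5,000 long ones do not.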
-
Sergiy Kolodyazhnyy almost 6 years @j0h I posted an answer. Also, please read the linked Kusalananda's answer as well. Basically, it's difficult to estimate the exact number of files.
-
j0h almost 6 yearsfind: warning: you have specified the -mindepth option after a non-option argument -type, but options are not positional (-mindepth affects tests specified before it as well as those specified after it). Please specify options before other arguments
-
Sergiy Kolodyazhnyy almost 6 years@j0h The -mindepth has to come before -type. I'll edit shortly
-
j0h almost 6 years This command failed in a later attempt, with: sudo: unable to execute /usr/bin/printf: Argument list too long. I think it may have succeeded previously because I had been actively removing items prior to using the command. Sergiy's answer worked without any file removal, so my choice of answer has changed.
-
steeldriver almost 6 years @j0h This answer works when printf is a bash shell builtin command; if you run it with sudo then (as you can see from the error message) sudo will invoke the external /usr/bin/printf, which is subject to the same argument-length restrictions as any other command.
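One workaround when root privileges are needed, sketched under the assumption above: run the whole pipeline inside a single shell started by sudo, so printf is that inner shell's builtin again. (Shown here without sudo so the example is safe to run; prepend sudo to the sh -c call when root is actually required. The scratch directory and filenames are made up.)

```shell
# Illustrative setup in a throwaway directory (filenames are hypothetical).
dir=$(mktemp -d)
cd "$dir" || exit 1
touch a.jpg b.jpg keep.txt

# The glob and the builtin printf both live inside the inner shell, so no
# external /usr/bin/printf is ever exec'd. With root: sudo sh -c '...'
sh -c 'printf "%s\0" *.jpg | xargs -0 rm --'

ls   # keep.txt remains
```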