How can I quickly find the 20 most recently modified files on Mac OS X (instead of using Python)?
Solution 1
On Mac OS X (10.10.2), try this:

find . -xdev -type f -print0 | xargs -0 stat -f "%m%t%Sm %N"

or run stat directly:

stat -f "%m%t%Sm %N" /path
From man stat:

In order to determine the three files that have been modified most recently, you could use the following format:

stat -f "%m%t%Sm %N" /tmp/* | sort -rn | head -3 | cut -f2-
Apr 25 11:47:00 2002 /tmp/blah
Apr 25 10:36:34 2002 /tmp/bar
Apr 24 16:47:35 2002 /tmp/foo
You could easily replace 3 with 20 (-:
Solution 2
find . -xdev -type f -printf "%T@ %Tc %p\n" | sort -n | tail -20
Author: ijt
Updated on September 18, 2022

Comments
- ijt almost 2 years: Of course it's possible to take the output of `find . -type f | xargs ls -l` and pipe it to a Python script that sorts the lines and outputs the top 20. But is there anything faster than that?
- DaleHarris541 over 9 years: Pass `-mtime` or `-mmin` to `find` to reduce the number of entries the Python script has to parse and sort.
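The `-mtime`/`-mmin` suggestion might look like this (a sketch: the 60-minute window is an illustrative choice, and the perl mtime printer is an assumed portable stand-in for BSD `stat -f`):

```shell
# Only consider files modified in the last 60 minutes, so sort has
# far fewer entries to handle; prints the 20 newest paths.
find . -xdev -type f -mmin -60 -print0 |
  perl -0ne 'chomp; my @s = stat($_); printf "%d\t%s\n", $s[9], $_ if @s' |
  sort -rn | head -n 20 | cut -f2-
```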
- Admin over 9 years: Someone down-voted the question. Why?
- DaleHarris541 over 9 years: Probably because this question is not unique to professional server administration; an end user might ask the same thing.
- Admin over 9 years: I see, so this question should have been asked on Stack Overflow.
- peterh over 9 years: With brew (brew.sh) you can use good GNU tools.
- Jay Wick over 9 years: That might work on Linux, but on Mac OS X it says `find: -printf: unknown primary or operator`.
- DaleHarris541 over 9 years: Try to trim the amount of data being piped to poor overworked `sort`.
- peterh over 9 years: @ijt On OS X, with brew you can install usable GNU tools.
- Jay Wick over 9 years: Which brew package has GNU find? `brew install find` turns up nothing.
- ijt over 9 years: This works: `time find . -xdev -type f -print0 | xargs -0 stat -f "%m%t%Sm %N" | sort -rn | head -n 20 | cut -f2-`, but it took 2 minutes, 11 seconds, so it's not quick.
- KM. over 9 years: You are searching the entire file system, so yes, it will take time. What is your definition of quick?
- ijt over 9 years: I'm looking for under 50 msec.
- Hoov almost 8 years: This is how many files you can scan with different disk types in 50 ms on paper, assuming 10 ms, 50000 ns, 20000 ns and 10 ns access times and ignoring overhead: HDD 5 files, SATA SSD 1,000 files, PCIe SSD 2,500 files, RAM disk 50 million files. Considering the numbers, a RAM disk seems to be your only option if you absolutely need that speed. An alternative might be to cache the results.
- Mark Setchell almost 3 years: `brew install findutils` should do the trick.
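With findutils installed via brew, GNU find is exposed as `gfind` on macOS, so Solution 2 runs unchanged (a sketch: the `command -v` fallback is an assumption so the same line also works where plain `find` is already GNU find, as on Linux):

```shell
# Prefer gfind (GNU find from brew's findutils on macOS); fall back to
# plain find, which already supports -printf on Linux.
FIND=$(command -v gfind || command -v find)
"$FIND" . -xdev -type f -printf '%T@ %Tc %p\n' | sort -n | tail -20
```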