How to grep a group of files within a time range?


Solution 1

Some of the fine things find (on GNU/Linux) can do for you:

Units:

  • n: exactly n units
  • -n: less than n units
  • +n: more than n units

What happened:

  • -atime: last time the file was accessed
  • -ctime: last change to the file itself (permissions, owner, …), not its content
  • -mtime: last change to the file's content
  • -amin n: n minutes ago
  • -atime n: n days (24-hour periods) ago
  • the same goes for -cmin/-ctime and -mmin/-mtime

Thus:

  • find -atime -30 → last accessed less than 30 days ago
  • find -ctime +5 → the file itself (metadata) changed more than 5 days ago
  • find -mtime +2 -mtime -31 → the file's content changed more than 2 but less than 31 days ago

Also: -daystart measures times from the start of today (00:00) rather than from 24 hours ago.
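For minute-level granularity, where -mtime counts in days, GNU find offers -mmin with the same +n/-n/n semantics; a minimal sketch:

```shell
# Find regular files modified more than 10 but less than 30 minutes ago
# in the current directory (GNU find):
find . -type f -mmin +10 -mmin -30
```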


Grepping

find stuff -exec grep pattern {} \; → the last part ({} \;) is essential; mind the single space between {} and \;

The -exec option allows incorporating other commands into find.
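Putting both pieces together (the *.log pattern and the ERROR string below are placeholders, not from the question):

```shell
# List *.log files modified in the last 2 days that contain "ERROR".
# Ending -exec with + instead of \; batches many files into one grep
# invocation, which is faster than running grep once per file:
find . -name "*.log" -mtime -2 -exec grep -l "ERROR" {} +
```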


Also: Why one shouldn't parse the output of ls

Solution 2

find <path> -iname "<name-pattern>" -mtime -7 -exec zgrep "<string>" {} \;

For example:

find /opt/WebSphere/AppServer/profiles/application/logs/ -iname "SystemOut*" -mtime -7 -exec zgrep "FileNotFoundException" {} \;

This searches the directory /opt/WebSphere/AppServer/profiles/application/logs/ for files whose names start with SystemOut and that were modified in the last 7 days, and greps each one for the string FileNotFoundException.
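If your find supports the GNU -newermt test, the same idea extends to an explicit clock-time range, which is what the question actually asks for; the timestamps below are examples:

```shell
# Regular files modified between 23:20 (inclusive) and 23:30 (exclusive)
# on 3 November 2022, using GNU find's -newermt test:
find . -type f -newermt "2022-11-03 23:20" ! -newermt "2022-11-03 23:30"
```

Append `-exec zgrep "FileNotFoundException" {} \;` to grep the matches as in the example above.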



Author: Steve
Updated on September 18, 2022

Comments

  • Steve
    Steve over 1 year

    I'm trying to write a script used on a buffer box that does full packet capture of network traffic. As it's for a fairly big network we split the captures into 100MB segments. At times of high network traffic we will often have multiple pcaps covering a single one-minute period.

    So what I want to do is have a bash script that lets the analyst who is searching for something specify a date and time and how many minutes either side of it they want to search for files. Obviously I can do something like this -

    ls -al | grep "Dec  1" | grep 02:00
    ls -al | grep "Dec  1" | grep 02:01
    

    and so on, get each result and grep each file individually for the specific keyword I'm looking for, but I'd like to be able to do a wider search for all files created within a time range and then grep each of them for the keyword.

    I'm not entirely sure how to do that, any help would be appreciated.

    • Eric
      Eric over 10 years
      The find command is your friend, especially the -ctime, -mtime and/or -newer options. Do a man find for more details.
    • Steve
      Steve over 10 years
      Yeah I've looked into using find, but that only works in days. We have 100% capture on the network for 30 days, averaging maybe, I dunno... thousands of 100mb segments per day. Doing searches over days of packets will put unnecessary strain on an already heavily loaded system.
    • Eric
      Eric over 10 years
      Culled from man find: -ctime n[smhdw], where s = second, m = minute (60 seconds), h = hour (60 minutes), d = day (24 hours), w = week (7 days). So you can specify units.
    • Steve
      Steve over 10 years
      I need to look for files in a RANGE of times on a specific date. Say between 11:20pm and 11:30pm on 3rd November.
    • MaQleod
      MaQleod over 10 years
      Here is a method using find: aaronbonner.io/post/28969404367/… (just omit the delete portion of course).
    • glenn jackman
      glenn jackman over 10 years
      I would use stat and date to find the files
    • Scott - Слава Україні
      Scott - Слава Україні over 10 years
      If your version of find supports the -newer test with a filename parameter, use touch to create files whose modification date/times are at the beginning and end of your desired range, and then use the expression -newer file2 ! -newer file1.
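The two-reference-file trick from the last comment can be sketched as follows; touch -t and the filename form of -newer are POSIX, so this is more portable than GNU-specific tests (the timestamps are examples):

```shell
# Bracket the range with two throwaway reference files whose mtimes are
# set via touch -t (format [[CC]YY]MMDDhhmm):
start=$(mktemp); touch -t 202211032320 "$start"   # 2022-11-03 23:20
end=$(mktemp);   touch -t 202211032330 "$end"     # 2022-11-03 23:30

# Files strictly newer than $start but not newer than $end:
find . -type f -newer "$start" ! -newer "$end"

rm -f "$start" "$end"
```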