How to use grep with a large number (millions) of files to search for a string and get results in a few minutes

Solution 1

You should remove the -0 argument to xargs and raise the -n parameter instead. (The -0 flag tells xargs to expect NUL-separated input, but plain find output is newline-separated, so xargs ends up treating the whole stream as one enormous argument; that is where the "argument line too long" error comes from.)

... | xargs -n16 ...
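As a rough sketch of what the full pipeline could look like (assuming GNU find and xargs; the -print0/-0 pair is optional and only needed so that file names containing spaces or newlines survive, and "string" stands in for the actual search term):

find . -type f -print0 | xargs -0 -n16 -P8 grep -H "string"

Each grep then receives a batch of up to 16 file names (-n16), and up to 8 grep processes run in parallel (-P8).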

Solution 2

It's not that big a stack of files (kudos to the 10⁷ files - a messy dream), but I created 100k files (400 MB overall; presumably mostly per-file block overhead, since each file holds only 10 bytes) with

for i in {1..100000}; do head -c 10 /dev/urandom > dummy_$i; done

and ran some tests out of pure curiosity (the keyword "10" I was searching for was chosen arbitrarily):

> time find . | xargs -n1 -P8 grep -H "10"
real 0m22.626s
user 0m0.572s
sys  0m5.800s

> time find . | xargs -n8 -P8 grep -H "10"
real 0m3.195s
user 0m0.180s
sys  0m0.748s

> time grep "10" *
real 0m0.879s
user 0m0.512s
sys  0m0.328s

> time awk '/10/' *
real 0m1.123s
user 0m0.760s
sys  0m0.348s

> time sed -n '/10/p' *
real 0m1.531s
user 0m0.896s
sys  0m0.616s

> time perl -ne 'print if /10/' *
real 0m1.428s
user 0m1.004s
sys  0m0.408s

Btw. there isn't a big difference in running time if I suppress the output by piping stdout to /dev/null. I am using Ubuntu 12.04 on a not-so-powerful laptop ;) My CPU is an Intel(R) Core(TM) i3-3110M @ 2.40GHz.

Out of further curiosity:

> time find . | xargs -n1 -P8 grep -H "10" 1>/dev/null

real 0m22.590s
user 0m0.616s
sys  0m5.876s

> time find . | xargs -n4 -P8 grep -H "10" 1>/dev/null

real 0m5.604s
user 0m0.196s
sys  0m1.488s

> time find . | xargs -n8 -P8 grep -H "10" 1>/dev/null

real 0m2.939s
user 0m0.140s
sys  0m0.784s

> time find . | xargs -n16 -P8 grep -H "10" 1>/dev/null

real 0m1.574s
user 0m0.108s
sys  0m0.428s

> time find . | xargs -n32 -P8 grep -H "10" 1>/dev/null

real 0m0.907s
user 0m0.084s
sys  0m0.264s

> time find . | xargs -n1024 -P8 grep -H "10" 1>/dev/null

real 0m0.245s
user 0m0.136s
sys  0m0.404s

> time find . | xargs -n100000 -P8 grep -H "10" 1>/dev/null

real 0m0.224s
user 0m0.100s
sys  0m0.520s
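
The trend in these numbers is process-startup overhead: with -n1, xargs forks and execs one grep per file, so most of the wall-clock time goes into launching processes rather than searching, and a larger -n amortizes that cost across many files per grep. A small hypothetical helper loop (not part of the original tests) to reproduce the sweep could look like:

for n in 1 8 64 1024; do
    echo "xargs -n$n:"
    time find . | xargs -n$n -P8 grep -H "10" > /dev/null
done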

Comments

  • Watt
    Watt almost 2 years

    This question is related to How to use grep efficiently?

    I am trying to search for a "string" in a folder which has 8-10 million small (~2-3 KB) plain text files. I need to know all the files that contain "string".

    At first I used this

    grep "string"
    

    That was super slow.

    Then I tried

    find . -exec grep "string" {} \; -print
    

    Based on the linked question, I used this

     find . | xargs -0 -n1 -P8 grep -H "string"
    

    I get this error:

    xargs: argument line too long
    

    Does anyone know a way to accomplish this task relatively quicker?

    I run this search on a server machine which has more than 50 GB of available RAM and a 14-core CPU. I wish I could somehow use all that processing power to run this search faster.

  • Watt
    Watt over 10 years
    Yes, there are too many files in one folder. Can you please elaborate on your solution, i.e. how to search for the "string" faster?
  • Watt
    Watt over 10 years
    +1 Thanks! It worked. I will wait for a few other responses before accepting this as an answer.
  • Mark Setchell
    Mark Setchell over 10 years
    Sadly I don't have a solution yet... I am still trying to understand the parameters of the question. What OS are you using? What filesystem are you using? Have you tried running the "find" command on its own and timing it? time find . | wc -l
  • Watt
    Watt over 10 years
    OS: Ubuntu (latest version). find returned results within 2 seconds.
  • phs
    phs over 10 years
    -n64 or -n128 might be a more realistic number.
  • Watt
    Watt over 10 years
    Is n the number of processes to be used concurrently?
  • Drey
    Drey almost 7 years
    man xargs shows [-n max-args]; the maximum number of processes is given by the -P flag: [-P max-procs]
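
To make the two flags concrete (a sketch assuming GNU find/xargs; -P14 is just a guess matched to the 14 cores mentioned in the question, and grep -l is my substitution since the question only needs the names of the matching files):

find . -type f -print0 | xargs -0 -n64 -P14 grep -l "string"

-n64 hands each grep invocation up to 64 file names, while -P14 keeps up to 14 grep processes running at once.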