Passing output from one command as argument to another

Solution 1

As a side note, tail displays the last 10 lines of a file by default.

A possible solution would be to grep this way:

for i in access.log*; do grep "$(tail "$i" | awk '{print $4}' | cut -d: -f1 | sed 's/\[/\\[/')" "$i" > "$i.output"; done

Solution 2

Why don't you break it up into steps?

for file in access.log*
do
  what=$(tail "$file" | awk '{print $4}' | cut -d: -f1)
  grep "$what" "$file" >> output
done

Solution 3

You shouldn't use ls that way; parsing the output of ls is unreliable, and ls -l gives you information you don't need. grep's -f option reads patterns from a file, and with -f - it reads them from standard input, so you can pipe the pattern to grep. Always quote variables that contain filenames.

for i in access.log*; do awk 'END {sub(":.*","",$4); print substr($4,2)}' "$i" | grep -f - "$i" > "$i.output"; done

I also eliminated tail and cut since AWK can do their jobs.
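On a small sample (hypothetical log data for illustration), the extraction step from this answer behaves like this; note it relies on awk keeping the last record available in the END block, which GNU awk does:

```shell
# Build a tiny sample log (hypothetical data for illustration).
printf '%s\n' \
  '10.0.0.1 - - [07/Oct/2010:23:59:59 +0300] "GET /a HTTP/1.1" 200' \
  '10.0.0.1 - - [08/Oct/2010:00:00:01 +0300] "GET /b HTTP/1.1" 200' > access.log

# In the last line, field 4 is "[08/Oct/2010:00:00:01"; sub() strips
# the time starting at the first ":", and substr() drops the leading "[".
awk 'END {sub(":.*","",$4); print substr($4,2)}' access.log
# → 08/Oct/2010
```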

Author: w00t

Updated on June 04, 2022

Comments

  • w00t
    w00t almost 2 years

    I have this for:

    for i in `ls -1 access.log*`; do tail $i |awk {'print $4'} |cut -d: -f 1 |grep - $i > $i.output; done
    

    ls will give access.log, access.log.1, access.log.2 etc.
    tail will give me the last line of each file, which looks like: 192.168.1.23 - - [08/Oct/2010:14:05:04 +0300] etc. etc. etc
    awk+cut will extract the date (08/Oct/2010 - but different in each access.log), which will allow me to grep for it and redirect the output to a separate file.

    But I cannot seem to pass the output of awk+cut to grep.

    The reason for all this is that those access logs include lines with more than one date (06/Oct, 07/Oct, 08/Oct) and I just need the lines with the most recent date.

    How can I achieve this?

    Thank you.

  • Fred Foo
    Fred Foo over 13 years
    This, too, overwrites output in every iteration.
  • w00t
    w00t over 13 years
I guess I can break it up, but I wanted to run it from the CLI. Anyhow, it's not the first time I've needed a feature like this, so I'm trying to find the answer for my scenario.
  • w00t
    w00t over 13 years
    I don't quite care about this aspect. I could have used tail -n 1.
  • w00t
    w00t over 13 years
    I don't see how xargs can help me. It would need to know what the output from all the prior commands is and then pass it as a pattern to grep. It would not be command | xargs grep 123 file.txt, but it would be command | xargs grep -pattern-from-command- file.txt
  • w00t
    w00t over 13 years
    @mouviciel - I already tried it exactly as you wrote it, but it gives grep: Unmatched [ or [^
  • mouviciel
    mouviciel over 13 years
I see, [ has a special meaning for grep. I'll edit my answer.
  • w00t
    w00t over 13 years
    sed 's#\[##g' - this one works. Nonetheless, I'm still curious about my initial question
  • w00t
    w00t over 13 years
It seems that this is the only solution, to write all the other commands inside the one you need. So, in my case, it is: for i in `ls -lf access.log*`; do grep $(tail $i |awk {'print $4'} |cut -d: -f 1| sed 's#\[##g') $i > $i.output; done. Thanks for the support.
  • tchrist
    tchrist over 13 years
    The output from ls -l is a big long line; you do not want to pass that whole thing to tail!
  • w00t
    w00t over 13 years
Why not? You think tail is gonna get sick? Fine, ls -1 instead of -l. This wasn't actually the problem at hand.
  • w00t
    w00t over 13 years
I see no problem with ls -l; it's how I've gotten used to using it. I could have used ls -1, but it's not that important. AWK isn't a good candidate for tail and cut's job because AWK reads the whole file to get to the END, and when one has 500+ MB of files, it takes too long. Now, about grep -f: that really is helpful. Thanks.
  • tripleee
    tripleee over 7 years
    mywiki.wooledge.org/ParsingLs explains why ls is problematic.
  • tripleee
    tripleee over 7 years
The elegant solution is to redirect the done to overwrite the file; this way, the entire loop body is redirected, and the file is opened for writing just once, instead of repeatedly inside the loop.
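The redirect-after-done pattern from the last comment might look like this (file names and contents are hypothetical):

```shell
# Two tiny sample logs (hypothetical data for illustration).
printf '%s\n' 'old entry' 'new entry' > access.log
printf 'newer entry\n' > access.log.1

# The > after "done" opens output once for the entire loop,
# instead of truncating it on every iteration as an in-loop > would.
for f in access.log access.log.1; do
  tail -n 1 "$f"
done > output

cat output
# → new entry
# → newer entry
```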
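For reference, on the earlier xargs discussion: with -I, xargs can splice the upstream output into an arbitrary argument position rather than appending it at the end (sample data and file name are hypothetical):

```shell
# A tiny sample log (hypothetical data for illustration).
printf '%s\n' \
  '192.168.1.23 - - [07/Oct/2010:09:00:00 +0300] "GET / HTTP/1.1" 200' \
  '192.168.1.23 - - [08/Oct/2010:14:05:04 +0300] "GET / HTTP/1.1" 200' > access.log

# -I{} makes xargs run grep once, with the extracted date substituted
# in as the pattern argument instead of appended after the file name.
tail -n 1 access.log | awk '{print $4}' | cut -d: -f1 | sed 's/\[//' \
  | xargs -I{} grep {} access.log
# → 192.168.1.23 - - [08/Oct/2010:14:05:04 +0300] "GET / HTTP/1.1" 200
```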