How can I curl the output of another command
Use xargs.
xargs utility [argument ...]
The xargs utility reads space, tab, newline and end-of-file delimited strings from the standard input and executes utility with the strings as arguments.
There are more parameters and options than in this shortened form, of course.
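As a quick illustration of that basic behavior (echo stands in for the utility so you can see exactly which arguments xargs passes):

```shell
# xargs collects the whitespace-delimited strings from stdin and
# appends them as arguments to the given utility:
printf 'a b c\n' | xargs echo
# → a b c

# -n 1 limits each invocation to one argument, so echo runs three times:
printf 'a b c\n' | xargs -n 1 echo
# → a
# → b
# → c
```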
A general example using curl:
$ echo "http://www.google.com" | xargs curl
<HTML><HEAD><meta http-equiv="content-type" content="text/html;charset=utf-8">
<TITLE>302 Moved</TITLE></HEAD><BODY>
<H1>302 Moved</H1>
The document has moved
<A HREF="http://www.google.de/">here</A>.
</BODY></HTML>
In your specific case, it'd look similar to the following:
./jspider.sh http://www.mypage.com | grep 'resource' | awk '{print $4}' | xargs curl | grep myString
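To see how the URLs flow into curl, you can substitute echo for curl so the constructed commands are printed rather than executed (the example.com URLs here are placeholders, not output from jspider):

```shell
# Each input line becomes one curl invocation; echo is swapped in for
# curl so the commands are shown instead of run:
printf 'http://a.example/\nhttp://b.example/\n' | xargs -n 1 echo curl -s
# → curl -s http://a.example/
# → curl -s http://b.example/
```

If the pipeline appears to produce no output at all, check whether the producing command writes to stderr rather than stdout; merging the streams with `2>&1` before the first pipe makes that output visible to xargs.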
Author: Admin. Updated on September 18, 2022.
Comments
-
Admin over 1 year: I want to pass curl the output from awk:
./jspider.sh http://www.mypage.com | grep 'resource' | awk '{print $4}' | curl OUTPUT_FROM_AWK | grep myString
How can I achieve this?
-
Robert over 12 years: Great! However, in my case this doesn't work; piping to "xargs echo" doesn't work either. I guess the output of jspider comes too fast or something.
-
HikeMike over 12 years: In that case, consider adding to your question or creating a new question dealing with your specific problem. I don't have this specific issue; e.g.
curl -s "http://superuser.com" | grep -E 'href="http://.*stackexchange\.com' | sed 's|^.*<a href="http://\([^"]*\)">.*$|http://\1|g' | grep -v "<" | xargs curl -s | grep "<title>"
works fine for me. (Yes, I know the code is extremely hacky.)
-
Robert over 12 years: Oh my... I have to decipher that sed part in order to understand what you are doing; it still isn't working with jspider. I will start using perl now.
-
HikeMike over 12 years: @StephanKristyn I extract URLs from hyperlinks, and since that doesn't work well enough, I then use grep -v to remove all remaining lines containing HTML tag brackets. Just a simple example that shows the approach works in general.
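That sed stage can be tried in isolation on a single fixed line; the hyperlink below is just a stand-in for a line of real page output:

```shell
# Extract the URL from a hyperlink, as in the pipeline above:
printf '<p><a href="http://meta.stackexchange.com/">meta</a></p>\n' |
  sed 's|^.*<a href="http://\([^"]*\)">.*$|http://\1|g'
# → http://meta.stackexchange.com/
```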