Get final URL after curl is redirected
Solution 1
curl's -w option and the sub-variable url_effective are what you are looking for.
Something like
curl -Ls -o /dev/null -w %{url_effective} http://google.com
More info
-L         Follow redirects
-s         Silent mode. Don't output anything
-o FILE    Write output to <file> instead of stdout
-w FORMAT  What to output after completion
You might want to add -I (that is an uppercase i) as well, which will make the command not download any "body", but it then also uses the HEAD method, which is not what the question included and risks changing what the server does. Sometimes servers don't respond well to HEAD even when they respond fine to GET.
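If you prefer to keep the GET method (sidestepping the HEAD caveat above), the same idea can be wrapped in a small function. This is only a sketch: final_url is a made-up name, and it assumes curl is installed.

```shell
# final_url: print the URL curl ends up at after following redirects.
# Uses GET (no -I) so servers that mishandle HEAD behave normally,
# and caps the redirect chain with --max-redirs.
final_url() {
  curl -Ls --max-redirs 10 -o /dev/null -w '%{url_effective}' -- "$1"
}

# usage:
# final_url http://google.com
```

The trailing -- marks the end of options, so a URL that happens to start with a dash is not misread as a flag.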
Solution 2
Thanks, that helped me. I made some improvements and wrapped that in a helper script "finalurl":
#!/bin/bash
curl "$1" -s -L -I -o /dev/null -w '%{url_effective}'
-o /dev/null  output to /dev/null
-I            don't actually download, just discover the final URL
-s            silent mode, no progress bars
This made it possible to call the command from other scripts like this:
echo `finalurl http://someurl/`
Solution 3
As another option:
$ curl -i http://google.com
HTTP/1.1 301 Moved Permanently
Location: http://www.google.com/
Content-Type: text/html; charset=UTF-8
Date: Sat, 19 Jun 2010 04:15:10 GMT
Expires: Mon, 19 Jul 2010 04:15:10 GMT
Cache-Control: public, max-age=2592000
Server: gws
Content-Length: 219
X-XSS-Protection: 1; mode=block
<HTML><HEAD><meta http-equiv="content-type" content="text/html;charset=utf-8">
<TITLE>301 Moved</TITLE></HEAD><BODY>
<H1>301 Moved</H1>
The document has moved
<A HREF="http://www.google.com/">here</A>.
</BODY></HTML>
But it doesn't go past the first one.
Solution 4
You can usually do this with wget. Run wget --content-disposition "url"; additionally, if you add -O /dev/null, you will not actually save the file.
wget -O /dev/null --content-disposition example.com
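wget also prints each redirect it follows in its log on stderr, so the final target can be scraped from that transcript. This is a fragile sketch: it depends on GNU wget's English "Location: ... [following]" log lines, and wget_final_location is a made-up name.

```shell
# wget_final_location: read a wget transcript on stdin and print the last
# redirect target it reported ("Location: <url> [following]" lines).
# Fragile: depends on GNU wget's English log format.
wget_final_location() {
  sed -n 's/^Location: \(.*\) \[following\]$/\1/p' | tail -n 1
}

# usage (wget logs to stderr, hence 2>&1):
# wget -O /dev/null --max-redirect=10 "http://google.com" 2>&1 | wget_final_location
```

curl's %{url_effective} is the sturdier approach; this is mainly useful on systems that ship wget but not curl.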
Solution 5
Thank you. I ended up implementing your suggestions: curl -i + grep
curl -i http://google.com -L | egrep -A 10 '301 Moved Permanently|302 Found' | grep 'Location' | awk -F': ' '{print $2}' | tail -1
Returns blank if the website doesn't redirect, but that's good enough for me as it works on consecutive redirections.
Could be buggy, but at a glance it works ok.
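The pipeline above can be hardened a little: rather than matching specific status texts like "301 Moved Permanently", scan every header block for a Location: line and keep the last one seen, which handles any 3xx chain. A sketch (last_location is a made-up name), meant to be fed the output of curl -siL:

```shell
# last_location: read HTTP response headers on stdin (possibly several
# redirect hops, as produced by `curl -siL <url>`) and print the last
# Location header seen, stripping the trailing CR that raw headers carry.
last_location() {
  awk 'tolower($1) == "location:" { sub(/\r$/, "", $2); loc = $2 }
       END { if (loc != "") print loc }'
}

# usage:
# curl -siL http://google.com | last_location
```

Like the original pipeline, this prints nothing when there is no redirect at all.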
Updated on November 27, 2020

Comments
-
vise over 3 years
I need to get the final URL after a page redirect preferably with curl or wget.
For example http://google.com may redirect to http://www.google.com.
The contents are easy to get (e.g. curl --max-redirs 10 http://google.com -L), but I'm only interested in the final URL (in the former case http://www.google.com). Is there any way of doing this by using only Linux built-in tools? (command line only)
-
Gavin Mogan, almost 14 years: you should be able to use "-o /dev/null" if you don't want the file
-
Josh, almost 14 years: That's a great option, I never knew curl could do that! It never ceases to amaze me :-)
-
user151841, about 12 years: That's more of a shell feature than curl
-
Zombo, almost 10 years: @DanielStenberg you need -I, otherwise it will actually download the file.
-
gw0, over 7 years: Thanks for those ideas. I rewrote it for terminal usage in my .bashrc file as a function, and there's no need for the terse options in that file, so I used the long names to self-document this:
finalurl() { curl --silent --location --head --output /dev/null --write-out '%{url_effective}' -- "$@"; }
-
Toolkit, about 7 years: doesn't work.
curl -Ls -o /dev/null -w %{url_effective} https://goo dot gl/un5E
outputs https://goo dot gl/un5E (replace dot with .)
-
ArigatoManga, almost 6 years: you are a genius!
-
Ivan Kozik, over 5 years: Some websites also need a spoofed user agent with curl -A ... to redirect to the expected location.
-
Maxwel Leite, almost 5 years: Replace -O /dev/null with just -O-. Better: wget -O- --content-disposition example.com
-
Eric Klien, almost 5 years: wget -O /dev/null --content-disposition example.com and wget -O- /dev/null --content-disposition example.com produce a lot more output than the redirected URL. curl $1 -s -L -I -o /dev/null -w '%{url_effective}' works fine for me.
-
SamB, over 4 years: Seems pretty uncommon that you'd know in advance that there would only be one redirect...
-
stefanct, over 3 years: Be aware that this might reveal/print your password as part of the URL if you are using a .netrc file (-n) for authentication. It might end up somewhere unexpected when using this in a script (e.g. in the insecure parameter list of another command).
-
OZZIE, over 3 years: Does this output query strings also? Like ?utm=x? Doesn't seem like it.
-
Daniel Stenberg, over 3 years: Sure it does. The query part is part of the URL.
-
theonlygusti, almost 3 years: How come even with -s you need -o /dev/null?
-
AndrewF, over 2 years: @theonlygusti -o /dev/null suppresses the HTTP response from being output anywhere. -s is for curl's own output (like progress or errors) and specifically does not cover the HTTP response.