Get final URL after curl is redirected


Solution 1

curl's -w option and its url_effective variable are what you are looking for.

Something like

curl -Ls -o /dev/null -w '%{url_effective}' http://google.com

More info

-L         Follow redirects
-s         Silent mode. Don't output anything
-o FILE    Write output to FILE instead of stdout
-w FORMAT  What to output after completion

More

You might want to add -I (an uppercase i) as well, which makes the command not download any body. However, it then uses the HEAD method, which is not what the question asked about, and that risks changing what the server does: some servers don't respond well to HEAD even when they respond fine to GET.
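To see %{url_effective} in action without depending on an external site, here is a self-contained sketch: a tiny redirecting server built from the Python standard library stands in for google.com. The port (8301) and the /old and /new paths are arbitrary choices for the demo, not anything from the thread.

```shell
# Start a minimal local server that 301-redirects /old to /new.
python3 - <<'EOF' &
from http.server import BaseHTTPRequestHandler, HTTPServer

class Redirector(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == '/old':
            self.send_response(301)
            self.send_header('Location', '/new')
            self.end_headers()
        else:
            self.send_response(200)
            self.end_headers()
    def log_message(self, *args):  # keep the demo output clean
        pass

HTTPServer(('127.0.0.1', 8301), Redirector).serve_forever()
EOF
server=$!
sleep 1   # give the server a moment to start

# curl follows the redirect and reports where it ended up.
curl -Ls -o /dev/null -w '%{url_effective}\n' http://127.0.0.1:8301/old
# prints http://127.0.0.1:8301/new

kill "$server"
```

Because no -I is used, this exercises the plain-GET path the question asked about.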

Solution 2

Thanks, that helped me. I made some improvements and wrapped that in a helper script "finalurl":

#!/bin/bash
curl "$1" -s -L -I -o /dev/null -w '%{url_effective}'
  • -o /dev/null discards the response instead of printing it
  • -I sends HEAD requests, so nothing is actually downloaded; curl just discovers the final URL
  • -s silent mode, no progress bars

This made it possible to call the command from other scripts like this:

echo `finalurl http://someurl/`

Solution 3

As another option:

$ curl -i http://google.com
HTTP/1.1 301 Moved Permanently
Location: http://www.google.com/
Content-Type: text/html; charset=UTF-8
Date: Sat, 19 Jun 2010 04:15:10 GMT
Expires: Mon, 19 Jul 2010 04:15:10 GMT
Cache-Control: public, max-age=2592000
Server: gws
Content-Length: 219
X-XSS-Protection: 1; mode=block

<HTML><HEAD><meta http-equiv="content-type" content="text/html;charset=utf-8">
<TITLE>301 Moved</TITLE></HEAD><BODY>
<H1>301 Moved</H1>
The document has moved
<A HREF="http://www.google.com/">here</A>.
</BODY></HTML>

But it doesn't go past the first redirect; without -L, curl stops at the first response.
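Adding -L to this header-printing form makes curl show the headers of every hop, so the whole chain becomes visible. A minimal sketch (the tr call strips the carriage return that terminates each raw HTTP header line; google.com here is just the thread's example URL):

```shell
# -s silences progress output, -I asks for headers only, -L follows redirects;
# grep then pulls out the Location header of every hop in the chain.
curl -sIL http://google.com | grep -i '^location:' | tr -d '\r'
```

The last line printed is where the chain ends.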

Solution 4

You can usually do this with wget: wget --content-disposition "url". Additionally, if you add -O /dev/null, you will not actually save the file.

wget -O /dev/null --content-disposition example.com
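If all you want out of wget is the final URL rather than its full log, one option is to scrape the "Location: ... [following]" lines it prints while following redirects. This is a sketch only, and it assumes GNU wget's English log format, which is not guaranteed to be stable or locale-independent:

```shell
# wget logs each redirect it follows to stderr as "Location: <url> [following]";
# the last such line is the end of the chain.
wget -O /dev/null http://google.com 2>&1 | grep -o 'Location: [^ ]*' | tail -1
```

The curl -w '%{url_effective}' approach from Solution 1 is more robust, since it doesn't depend on log formatting.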

Solution 5

Thank you. I ended up implementing your suggestions: curl -i + grep

curl -i http://google.com -L | egrep -A 10 '301 Moved Permanently|302 Found' | grep 'Location' | awk -F': ' '{print $2}' | tail -1

Returns blank if the website doesn't redirect, but that's good enough for me, as it works on consecutive redirections.

Could be buggy, but at a glance it works ok.
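A slightly more defensive variant of the same grep idea, as a sketch: it matches any hop's Location header case-insensitively (so 302/307/308 responses and lowercase "location:" are covered), strips the carriage return from raw header lines, and falls back to the original URL when nothing redirected. The script name lasturl is a made-up choice for the example.

```shell
#!/bin/bash
# Usage: lasturl <url>
# Follows redirects with HEAD requests and prints the last Location header
# seen, or the original URL if the server never redirected.
last=$(curl -sIL "$1" | grep -i '^location:' | tr -d '\r' | awk '{print $2}' | tail -1)
echo "${last:-$1}"
```

Unlike the pipeline above, this doesn't depend on the exact status-line text ("301 Moved Permanently" / "302 Found").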

Author: vise

Updated on November 27, 2020

Comments

  • vise
    vise over 3 years

    I need to get the final URL after a page redirect preferably with curl or wget.

    For example http://google.com may redirect to http://www.google.com.

    The contents are easy to get (e.g. curl --max-redirs 10 http://google.com -L), but I'm only interested in the final URL (in the former case, http://www.google.com).

    Is there any way of doing this by using only Linux built-in tools? (command line only)

  • Gavin Mogan
    Gavin Mogan almost 14 years
    you should be able to use "-o /dev/null" if you don't want the file
  • Josh
    Josh almost 14 years
    That's a great option, I never knew curl could do that! It never ceases to amaze me :-)
  • user151841
    user151841 about 12 years
    That's more of a shell feature than curl
  • Zombo
    Zombo almost 10 years
    @DanielStenberg you need -I otherwise it will actually download the file.
  • gw0
    gw0 over 7 years
    Thanks for those ideas. I rewrote it for terminal usage in my .bashrc file as a function, and there's no need for the terse options in that file, so I used the long names to self-document this: finalurl() { curl --silent --location --head --output /dev/null --write-out '%{url_effective}' -- "$@"; }
  • Toolkit
    Toolkit about 7 years
    doesn't work. curl -Ls -o /dev/null -w %{url_effective} https://goo dot gl/un5E outputs https://goo dot gl/un5E (replace dot with .)
  • ArigatoManga
    ArigatoManga almost 6 years
    you are a genius!
  • Ivan Kozik
    Ivan Kozik over 5 years
    Some websites also need a spoofed user agent with curl -A ... to redirect to the expected location.
  • Maxwel Leite
    Maxwel Leite almost 5 years
    Replace -O /dev/null with just -O-. Better: wget -O- --content-disposition example.com
  • Eric Klien
    Eric Klien almost 5 years
    wget -O /dev/null --content-disposition example.com and wget -O- /dev/null --content-disposition example.com produce a lot more output than the redirected URL. curl $1 -s -L -I -o /dev/null -w '%{url_effective}' works fine for me.
  • SamB
    SamB over 4 years
    Seems pretty uncommon that you'd know in advance that there would only be one redirect ...
  • stefanct
    stefanct over 3 years
    Be aware that this might reveal/print your password as part of the URL if you are using a .netrc file (-n) for authentication. It might end up somewhere unexpected when using this in a script (e.g. in the insecure parameter list of another command).
  • OZZIE
    OZZIE over 3 years
    Does this output query strings also? Like ?utm=x ? Doesn't seem like it
  • Daniel Stenberg
    Daniel Stenberg over 3 years
    Sure it does. The query part is part of the URL.
  • theonlygusti
    theonlygusti almost 3 years
    How come even with -s you need -o /dev/null?
  • AndrewF
    AndrewF over 2 years
    @theonlygusti -o /dev/null suppresses the HTTP response from being output anywhere. -s is for curl's own output (like progress or errors) and specifically does not cover the HTTP response.