NGINX redirect if user agent contains XYZ


With ~* you are performing a case-insensitive regex match, but your regex ^XYZ$ is anchored at both ends: it matches only a user agent that starts with XYZ and ends with XYZ, i.e. the exact string XYZ. That's not a loose match at all. You are kind of contradicting yourself there.

See documentation here: http://nginx.org/en/docs/http/ngx_http_core_module.html#location

You probably want to adjust your regex to allow for more variations. From your question it's hard to know exactly what you would like to match, but possibly something like:

# Case-insensitively redirect any request whose user agent contains "XYZ"
# (the leading/trailing .* are redundant: the regex is unanchored anyway)
if ($http_user_agent ~* ".*XYZ.*") {
   rewrite ^/(.*)$ https://www.example.com/$1 permanent;
}
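
If the goal is the opposite, i.e. to skip the redirect for agents containing XYZ (which is what the question asks), nginx's negated operator !~* is one way to express that. A minimal sketch against the vhost from the question, with example.com and the XYZ pattern as placeholders:

server {
    listen 80;
    server_name example.com www.example.com;

    # Redirect every client to https EXCEPT user agents containing "XYZ";
    # !~* means "case-insensitive regex does not match"
    if ($http_user_agent !~* "XYZ") {
        rewrite ^/(.*)$ https://www.example.com/$1 permanent;
    }
}

Matching agents then fall through to whatever the rest of the server block serves over plain http.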

Comments

  • Pikk over 1 year

    I have redirected my site from http to https. I want to allow the user agent XYZ, and any user agent containing XYZ, to access both the http and https versions.

    Now the vhost looks like this:

    server {
            listen 80;
            server_name example.com www.example.com;
            rewrite  ^/(.*)$  https://www.example.com/$1 permanent;
    }
    

    How can I edit it in order to allow agents that contain XYZ to see both HTTP and HTTPS? In other words, how do I disable the redirect for such agents?

    I tried

    if ($http_user_agent ~* "^XYZ$") {
       rewrite ^/(.*)$ https://www.example.com/$1 permanent;
    }
    

    But this seems to redirect only the exact agent XYZ. What I need is to disable the redirect for such agents, and to match anything containing XYZ, not strictly XYZ...

  • Pikk over 6 years
    I have HTTP-to-HTTPS redirection. I want to add a condition that allows the Facebook crawler to see both http and https (i.e. to disable the redirect for FB). The FB agent has several names, but all of them contain facebookexternalhit/1.1
  • JayMcTee over 6 years
    Advanced crawlers like Google and Facebook don't want you to modify application behaviour just for them. You basically want to show them something you will never show their trusted users. I wouldn't bother.
  • Pikk over 6 years
    No, I disagree. FB developers suggest it. developers.facebook.com/docs/plugins/faqs#faq_1149655968420144 "This also requires that the old URL still renders a document with Open Graph tags and returns a HTTP 200 response, at least when loaded by Facebook's crawler. If you want other clients to redirect when they visit the URL, you must send your 301 HTTP response to all non-Facebook crawler clients. The old URL should contain its own og:url tag that points to itself. You can learn how to recognize Facebook's crawler in our Sharing Best Practices Guide."
  • JayMcTee over 6 years
    That snippet doesn't mean you can't serve them https results with status 200.
  • Pikk over 6 years
    If I want to transfer likes and shares that were given when the site was still http, then I need to allow the crawler to still see the old page.
  • JayMcTee over 6 years
    I suggest next time you put such info in the question. The fact it's Facebook and commonly used plugin functionality is hardly sensitive and helps us put it all in context, giving better answers. Either way, my answer should bring you a step in the right direction, as your regex was too restrictive.
  • Pikk over 6 years
    I will formulate a better question in the future. So that works fine, thank you: if ($http_user_agent ~* ".*facebookexternalhit/1.1.*") { rewrite ^/(.*)$ https://www.example.com/$1 permanent; } (see also the sketch after these comments)
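
For reference, a minimal sketch of a vhost that follows the Facebook guidance quoted above (send the 301 to all non-Facebook clients, let the crawler fetch the http version and get a 200). The facebookexternalhit/1.1 token comes from the comments; the server names and everything else here are placeholder assumptions, not a tested configuration:

server {
    listen 80;
    server_name example.com www.example.com;

    # Facebook's crawler is exempt from the redirect, so it still
    # receives a 200 for the old http URL (and its Open Graph tags);
    # every other client gets the permanent redirect to https
    if ($http_user_agent !~* "facebookexternalhit") {
        rewrite ^/(.*)$ https://www.example.com/$1 permanent;
    }
}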