Pages don't get blocked with Squid over HTTPS


Solution 1

I fixed my problem by writing ssl_bump server-first all and removing ssl_bump allow all. I'm not sure whether it contributed to the fix, but I also ran these commands in my terminal:

/sbin/iptables -t nat -A PREROUTING -p TCP -s 127.0.0.1 --dport 80 -j REDIRECT --to-port 3128
/sbin/iptables -t nat -A PREROUTING -p TCP -s 127.0.0.1 --dport 443 -j REDIRECT --to-port 3128

Although this does not produce the "Access Denied" message when I navigate to blacklisted pages over HTTPS, it does give me "Proxy server is refusing connections". This is after I configured Firefox to use the Squid proxy for all protocols (i.e. HTTP and HTTPS) and added the root certificate I generated earlier (/usr/local/squid/etc/squid.pem).
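For reference, a minimal sketch of what the relevant part of squid.conf looks like after this change (the certificate path is the one from my setup; adjust it to yours):

```
# Complete the TLS handshake with the origin server first, then mimic
# its certificate to the client so Squid can see the decrypted URL.
http_port 3128 ssl-bump generate-host-certificates=on dynamic_cert_mem_cache_size=10MB cert=/usr/local/squid/etc/squid.pem

# Replaces the previous "ssl_bump allow all" line.
ssl_bump server-first all
```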

Solution 2

Because SSL is end-to-end encryption, a proxy such as Squid normally knows much less about an HTTPS request than it does about an HTTP request (http://wiki.squid-cache.org/Features/HTTPS#CONNECT_tunnel):

[Many] common parts of the request URL do not exist in a CONNECT request:

  • the URL scheme or protocol (e.g., http://, https://, ftp://, voip://, itunes://, or telnet://),
  • the URL path (e.g., /index.html or /secure/images/),
  • and the query string (e.g., ?a=b&c=d).

To know more than this about an HTTPS request, Squid has to perform what is essentially a man-in-the-middle attack on its clients. The Squid documentation explains how to do so, but note that this comes with privacy issues (your users have to trust you with their normally-encrypted information, and web browsers may warn about the interception).
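To make this concrete, here is roughly the plaintext a proxy sees when a browser opens an HTTPS URL through it (an illustrative sketch; exact headers vary by browser). Only the hostname and port appear; the path and query string travel inside the encrypted tunnel after the CONNECT succeeds:

```shell
# Illustrative only: without ssl-bump, this CONNECT line is all the
# proxy sees. A path like /r/LearnJapanese or a query string never
# appears here, so a url_regex on the path cannot match.
printf 'CONNECT www.reddit.com:443 HTTP/1.1\r\nHost: www.reddit.com:443\r\n\r\n'
```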


Author: q3d

Updated on September 18, 2022

Comments

  • q3d
    q3d over 1 year

    I have set up Squid to block pages on my own system (i.e. not on a network), and I'm trying to get SSL to work with page blocking. To this end, I've set up ssl-bump and installed the certificate in my browser.

    I want to block *.reddit.com/* (over both HTTP and HTTPS), but I want to allow only the child URL *.reddit.com/r/LearnJapanese (over both HTTP and HTTPS).

    Here is part of my squid.conf file:

    acl bad_domain url_regex "/usr/local/squid/etc/block.acl"
    acl good_domain url_regex "/usr/local/squid/etc/allow.acl"
    
    http_access deny bad_domain !good_domain
    http_access allow good_domain
    
    http_access allow localnet
    http_access allow localhost
    
    http_access deny all
    
    # Squid normally listens to port 3128
    http_port 3128 ssl-bump generate-host-certificates=on dynamic_cert_mem_cache_size=10MB cert=/usr/local/squid/etc/squid.pem
    
    ssl_bump allow all
    
    sslproxy_cert_error allow all
    sslproxy_flags DONT_VERIFY_PEER
    

    Contents of block.acl:

    ^http(s)?://(.+)?reddit\.com(.+)?$
    

    Contents of allow.acl:

    ^http(s)?://(.+)?reddit\.com/r/LearnJapanese(.+)?$
    
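    A quick way to sanity-check both patterns outside Squid is grep -E, since url_regex uses the same POSIX extended (egrep-style) regex flavor:

```shell
# The allow pattern should match the subreddit URL...
echo 'https://www.reddit.com/r/LearnJapanese/comments/abc' \
  | grep -qE '^http(s)?://(.+)?reddit\.com/r/LearnJapanese(.+)?$' && echo allow-match

# ...and the block pattern should match any Reddit URL.
echo 'https://www.reddit.com/r/funny' \
  | grep -qE '^http(s)?://(.+)?reddit\.com(.+)?$' && echo block-match
```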

    It works fine over HTTP (i.e. reddit.com/r/LearnJapanese can be accessed, but the rest of Reddit cannot), but I haven't had the same luck with HTTPS.

    When I access Reddit over HTTPS, the pages are not blocked at all, but they should be (except for reddit.com/r/LearnJapanese of course).

    How can I block access to all of Reddit (aside from .reddit.com/r/LearnJapanese/) over both HTTP and HTTPS? Thank you.

    • heemayl
      heemayl about 9 years
      Try this regex: ^http[s]?:\/\/(w{3}\.| \.)reddit\.com\/r\/LearnJapanese(\/)?$ (check regex101.com/r/xS1wF7/2). If any particular error shows up regarding regex character escaping/usage, let me know.
    • q3d
      q3d about 9 years
      @heemayl, do you know if this is egrep-compatible? Squid (as far as I've read somewhere) will only accept egrep-style regexes, which is why I kept mine simple. I think my regex works (because it works over HTTP); the question is why nothing is blocked when I access Reddit over HTTPS. I'll try your regex.
    • heemayl
      heemayl about 9 years
      I see. My regex is egrep-compatible, but your problem seems different; it's not about the regex. I've heard people have a very hard time blocking HTTPS through Squid. Search for relevant threads, and you can also try Server Fault for sysadmin solutions on this.
  • dhag
    dhag about 9 years
    I noticed after posting that you are already using ssl_bump. Then I'm not sure what's wrong; can you tell for sure whether your clients are seeing the Squid-provided certificate?
  • q3d
    q3d about 9 years
    dhag: I've installed the certificate in both of my browsers (Chrome and Firefox), and when I visit reddit.com, the CA is the correct one (i.e. the one from the root certificate I installed in my browsers). I am using ssl-bump because I'm the only one using Squid; there are no other users, so I'm only violating my own privacy. Is there anything other than ssl-bump I need to set up?
  • dhag
    dhag about 9 years
    OK, then I really don't know what's wrong. Sorry for wasting your time :(. As a last stab, perhaps move the ssl_bump directive before the URL-based ACLs (I don't know whether that makes a difference, but we want to do the MITM before we can apply a URL regexp)?
  • q3d
    q3d about 9 years
    I have news. I changed my proxy configuration (I switched it from system-wide to Firefox-only) and there is some filtering now, but not as it should be. If I navigate to reddit.com/r/x (which works), then click through to reddit.com/r/y, I get an Access Denied message that says "Access denied to reddit.com/r/x". That is, the message is one page too late, and I can access any page simply by refreshing it. Any idea what it could be?
  • q3d
    q3d about 9 years
    dhag: I tried moving the ssl-bump directives before the ACLs, but nothing changed. Thanks for your suggestion though