SSL failure on Windows using python requests
Solution 1
It is possible to get the Requests library to use Python's built-in ssl module to handle the SSL portion of the HTTPS connection. This works because the urllib3 utilities that Requests uses allow passing a Python SSLContext into them.
However, note that this may depend on the necessary certificates already having been loaded into the trust store by a previous Windows access (see this comment).
Some sample code follows (this needs a recent version of Requests; it works with 2.18.4):
import requests
from requests.adapters import HTTPAdapter
from requests.packages.urllib3.util.ssl_ import create_urllib3_context

class SSLContextAdapter(HTTPAdapter):
    def init_poolmanager(self, *args, **kwargs):
        context = create_urllib3_context()
        kwargs['ssl_context'] = context
        context.load_default_certs()  # this loads the OS defaults on Windows
        return super(SSLContextAdapter, self).init_poolmanager(*args, **kwargs)

s = requests.Session()
adapter = SSLContextAdapter()
s.mount('https://myinternalsite', adapter)
response = s.get('https://myinternalsite')
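As the comments below note, the same effect can be had without importing a private urllib3 helper: the standard library's ssl.create_default_context() performs the load_default_certs() step itself. A sketch of that variant (the hostname is the same placeholder as above):

```python
import ssl

import requests
from requests.adapters import HTTPAdapter

class SSLContextAdapter(HTTPAdapter):
    def init_poolmanager(self, *args, **kwargs):
        # create_default_context() calls load_default_certs() internally,
        # so the OS trust store (including Windows') is picked up
        kwargs['ssl_context'] = ssl.create_default_context()
        return super(SSLContextAdapter, self).init_poolmanager(*args, **kwargs)

s = requests.Session()
s.mount('https://myinternalsite', SSLContextAdapter())
```

Nothing here touches the network until a request is actually made; the mount only registers the adapter for URLs under that prefix.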
Solution 2
Requests doesn't use your Windows root CA store like your browser does.
From the docs: "By default, Requests bundles a set of root CAs that it trusts, sourced from the Mozilla trust store. However, these are only updated once for each Requests version."
This list of trusted CAs can also be specified through the REQUESTS_CA_BUNDLE environment variable.
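For example, a sketch of setting that variable from Python before any requests are made (the bundle path is hypothetical):

```python
import os

# Hypothetical path; point this at any PEM bundle you trust, e.g. the
# Mozilla bundle downloadable from curl.haxx.se/ca/cacert.pem.
# Every subsequent requests call in this process will verify against it.
os.environ['REQUESTS_CA_BUNDLE'] = r'C:\certs\cacert.pem'
```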
You can literally do this:
cafile = 'cacert.pem' # http://curl.haxx.se/ca/cacert.pem
r = requests.get(url, verify=cafile)
Or you can use certifi if your CA cert is signed by a public entity.
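A minimal sketch of the certifi route, assuming certifi is installed (it ships the same Mozilla bundle but is updated independently of Requests releases):

```python
import certifi

# certifi.where() returns the path to certifi's bundled cacert.pem;
# pass it as the verify= argument, e.g. requests.get(url, verify=cafile)
cafile = certifi.where()
```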
user6357781
Updated on May 09, 2020
Comments
-
user6357781 almost 4 years
Apologies for the very long post, but I'm really trying to be thorough...
I have a dedicated web site that serves as a bridge to exchange data between various environmental models operated from remote servers and running on different OSes (Linux, macOS and Windows). Basically each server can upload/download data files to the web site, and files are then used for further processing with a different model on another server.
The web site has some basic protection (IP filtering, password, and SSL using Let's Encrypt certificates). All the remote servers can access the site and upload/download data through a simple web interface that we have created.
Now we are trying to automate some of the exchange with a simple python (2.7) daemon (based on the requests module). The daemon monitors certain folders and uploads the content to the web site.
The daemon works fine on all of the remote servers, except for one running Windows 7 Enterprise 64bit. This server has Python 2.7.13 installed and the following packages: DateTime (4.1.1), psutil (5.2.0), pytz (2016.10), requests (2.13.0), zope.interface (4.3.3).
From this server the SSL connection works fine through a web browser, but the daemon always returns:
raise SSLError(e, request=request) requests.exceptions.SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:661)
Here is what we tried so far:
- setting verify=False. This works fine, but we cannot use it in our final production environment.
- copying the certificate from another server where the daemon works, and setting verify=(name of the certificate file) (no success)
- setting the 'User-agent' to the exact same string that we get from the Windows machine on the web site when the connection is done with a web browser (no success)
What other setting should we be looking at on the Windows server to try to solve the problem? Could it be a firewall setting that somehow allows the browser's SSL connection through but blocks the python daemon?
UPDATE
The organization that is running the Windows remote server that was producing the error substitutes all SSL certificates at the proxy level.
Their IT people solved our problem by adding the URL of our web site to the list of "pass through" sites in their proxy settings. This works and is fine for now. However, I'm wondering if we could have handled the certificate substitution directly in python...
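One way the substitution could likely have been handled in Python: export the proxy's signing root CA from the Windows certificate store (certmgr.msc, Base-64 X.509 export) and point verify= at it. A sketch, with a hypothetical file name:

```python
import requests

s = requests.Session()
# 'proxy-root-ca.pem' is a hypothetical export of the corporate proxy's
# signing CA; session-level verify applies to every request on the session
s.verify = 'proxy-root-ca.pem'
```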
-
user6357781 about 7 years: Thanks for your reply. We already tried this approach (second bullet point), but without success. I edited my question to make it clearer.
-
Artagel about 7 years: Which certificate did you copy? You need the issuing certificate authority's cert, not the web server's cert.
-
user6357781 about 7 years: We copied the cacert.pem file from another Windows server where the daemon works, from C:\Python27\lib\site-packages\requests\cacert.pem
-
Artagel about 7 years: If it is working on another machine with the same code, then there's something up with the file, like it getting mangled on copy. Otherwise I'm not sure.
-
user6357781 about 7 years: The file is fine... we tried the same file on other Windows machines to double-check. That's why we are questioning other possible settings at the OS level...
-
Artagel about 7 years: Perhaps your requests is old, or your SSL libraries are old and don't support the required ciphers? It makes no sense trying to come up with a different solution; fix the root problem on your box.
-
user6357781 about 7 years: I'm not sure what you mean by "fix the root problem on your box"... If you read my question you'll see that everything works just fine on every other system we have tried it on. And the machine on which it does not work has exactly the same version of python and requests (and all the other installed packages) as all the other servers... so I'm just trying to figure out what else I can (or should) be looking into on the machine on which it does not work.
-
Artagel about 7 years: Right, my point is that the problem is not the python code. It is another issue on your computer.
-
kraussian over 5 years: Doesn't work for me. I'm using Requests v2.19.1, and it gives me this error: 'PyOpenSSLContext' object has no attribute 'load_default_certs'
-
jakob.j over 5 years: For Python 3.6.5 and requests 2.19.1, I had to replace the create_urllib3_context import with import ssl and then change the context's assignment to context = ssl.create_default_context().
-
David Fraser about 5 years: Thanks for the feedback. On Windows 10, using Python 2.7.15 and requests 2.21.0, the create_urllib3_context import and code above still work correctly for me.
-
David Fraser about 5 years: @kraussian, what platform are you running on? How did you install Python?
-
David Fraser about 5 years: @tm1212, on requests 2.18.4, 2.19.0 and 2.21.0, the call to SSLContext fails for me with TypeError: __new__() takes at least 2 arguments (1 given). Are you doing something different?
-
David Fraser about 5 years: @jakob.j you raise a good point ... the ssl create_default_context method actually does the load_default_certs operation automatically, and this is a simpler way to set up the context without having to import a private member from urllib3. This seems to work on both Python 2.7 and Python 3. My testing of the above code also works on Python 3, but it could be simpler using your method.
-
Tom M. about 5 years: I had it working with Python 3 and only tried it against requests 2.12.4 and 2.19.1. I removed my comment as @jakob.j has the better solution.
-
Josh about 5 years: Tried this solution and it works with Python 3.7 and requests 2.21.0. Is there a way to apply this solution to all requests, without a session?
-
cowlinator almost 5 years: @Josh, every call to requests.get(), etc. always uses a session behind the scenes. So there's no way to use requests without a session.
-
cowlinator over 4 years: You can also set verify= to a directory containing certificates, but you must first run openssl rehash on the directory. See requests.kennethreitz.org/en/master/user/advanced/… for more info.
-
hoefling about 4 years: Awesome, thanks for the recipe. Confirmed working on Win 10, Python 3.6.8, requests==2.20.
-
shadowtalker over 3 years: Is there a way to make this work for any site, instead of just a specific domain? This is needed if you are on a corporate network that MitMs all HTTPS requests.
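One answer to that last question: mount the adapter on the bare scheme prefix, which matches every HTTPS URL the session touches, not just one host. A sketch reusing the adapter idea from Solution 1, with the simpler ssl.create_default_context() setup mentioned in the comments:

```python
import ssl

import requests
from requests.adapters import HTTPAdapter

class SSLContextAdapter(HTTPAdapter):
    def init_poolmanager(self, *args, **kwargs):
        # create_default_context() loads the OS trust store automatically
        kwargs['ssl_context'] = ssl.create_default_context()
        return super(SSLContextAdapter, self).init_poolmanager(*args, **kwargs)

s = requests.Session()
# 'https://' matches every HTTPS URL, so the adapter applies session-wide
s.mount('https://', SSLContextAdapter())
```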