Python requests error 10060
Try increasing the timeout parameter of your requests.get method:

requests.get(functionurl, headers=headers, timeout=5)
But the odds are that your script is being blocked by the server to prevent scraping attempts. If this is the case, you can try faking a web browser by setting appropriate headers, for example:
{"User-Agent": "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.2.8) Gecko/20100722 Firefox/3.6.8 GTB7.1 (.NET CLR 3.5.30729)", "Referer": "http://example.com"}
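Since error 10060 usually signals a transient connection timeout, wrapping the request in a simple retry loop can also help. This is a minimal sketch; the get_with_retries helper and its defaults are illustrative, not part of the requests library:

```python
import time

# Hypothetical helper (not part of requests): call fetch() up to
# `attempts` times, pausing `delay` seconds between tries, and
# re-raise the last error if every attempt fails.
def get_with_retries(fetch, attempts=3, delay=2):
    last_error = None
    for attempt in range(attempts):
        try:
            return fetch()
        except Exception as error:  # e.g. requests.exceptions.ConnectionError
            last_error = error
            if attempt < attempts - 1:
                time.sleep(delay)  # brief pause before the next attempt
    raise last_error

# Usage (assuming requests, functionurl and headers from the question):
# response = get_with_retries(
#     lambda: requests.get(functionurl, headers=headers, timeout=5))
```

This keeps the retry policy in one place instead of scattering try/except blocks through the crawler.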
Author: brian
Updated on June 04, 2022

Comments
-
brian almost 2 years
I have a script that crawls a website. Until today it ran perfectly; however, it does not do so now.
It gives me the following error:
ConnectionAbortedError(10060, 'A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond')
I have been looking into answers and settings but I cannot figure out how to fix this...
In IE I am not using any proxy (Connection -> LAN Settings -> Proxy = Disabled).
It breaks in this piece of code, sometimes on the first run, sometimes on the 2nd, and so on:
def geturls(functionurl, runtime):
    startCrawl = requests.get(functionurl, headers=headers)
    mainHtml = BeautifulSoup(startCrawl.content, 'html.parser')
    mainItems = mainHtml.find("div", {"id": "js_multiselect_results"})
    for tag in mainItems.findAll('a', href=True):
        tag['href'] = urlparse.urljoin(url, tag['href'])
        if shorturl in tag['href'] and tag['href'] not in visited:
            if any(x in tag['href'] for x in keepout):
                falseurls.append(tag['href'])
            elif tag['href'] in urls:
                doubleurls.append(tag['href'])
            else:
                urlfile.write(tag['href'] + "\n")
                urls.append(tag['href'])
    totalItemsStart = str(mainHtml.find("span", {"id": "sab_header_results_size"}))
    if runtime == 1:
        totalnumberofitems[0] = totalItemsStart
        totalnumberofitems[0] = strip_tags(totalnumberofitems[0])
    return totalnumberofitems
How can I fix this?
-
brian over 9 years: I haven't got the timeout parameter, but I do have the headers (hence headers=headers). What I don't have is the .NET and Referer parameters. What does Referer do?
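For context: Referer is a standard HTTP request header that tells the server which page linked to the URL being requested, and some sites reject or throttle requests that arrive without a plausible one. A minimal sketch of browser-like headers; the values here are placeholders for illustration, not anything a specific site requires:

```python
# Placeholder values for illustration only.
headers = {
    "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    # Referer names the page the request supposedly came from; some
    # servers check it to filter out non-browser traffic.
    "Referer": "http://example.com",
}

# response = requests.get(functionurl, headers=headers, timeout=5)
```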