Multi-threaded Requests in Python 3

Solution 1 - concurrent.futures.ThreadPoolExecutor with a fixed number of threads

By wrapping the call in a custom function (request_post) you can pass whatever arguments you need to requests.post.

import concurrent.futures
import requests

def request_post(url, data):
    return requests.post(url, data=data)

with concurrent.futures.ThreadPoolExecutor() as executor:  # the default worker count is usually close to optimal
    res = [executor.submit(request_post, url, data) for data in names]
    concurrent.futures.wait(res)

res will be a list of requests.Response objects, one per request made, each wrapped in a Future instance. To access a requests.Response you call res[index].result(), where index runs from 0 to len(names) - 1.

Future objects give you finer control over the responses received: you can check whether each call completed correctly, raised an exception, timed out, and so on. See the concurrent.futures documentation for more on Future objects.
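For example, here is a minimal sketch of collecting the responses with concurrent.futures.as_completed and surfacing errors (the url value is a placeholder, not from the original; names comes from the question):

import concurrent.futures
import requests

names = ["dfg", "dddfg", "qwed"]   # sample payloads from the question
url = "https://example.com/api"    # placeholder URL

def request_post(url, data):
    return requests.post(url, data=data)

with concurrent.futures.ThreadPoolExecutor() as executor:
    futures = [executor.submit(request_post, url, data) for data in names]
    # as_completed yields each Future as soon as it finishes
    for future in concurrent.futures.as_completed(futures):
        try:
            response = future.result()  # re-raises any exception from the worker thread
            print(response.status_code, len(response.text))
        except requests.RequestException as exc:
            print("request failed:", exc)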

You also avoid the problems that can come with spawning a very large number of threads (see Solution 2).


Solution 2 - multiprocessing.dummy.Pool, spawning one thread per request

This might be useful if you are not requesting many pages, or if the response times are quite slow.

from multiprocessing.dummy import Pool as ThreadPool
import itertools
import requests

with ThreadPool(len(names)) as pool:  # creates one thread per entry in names
    res = pool.starmap(requests.post, zip(itertools.repeat(url), names))

pool.starmap is used to pass (map) multiple arguments to one function (requests.post) that is called by the pool of threads. It returns a list of requests.Response objects, one per request made.

itertools.repeat(url) makes the first argument repeat once per request; zip pairs it with each name, producing the (url, data) tuples that starmap unpacks into each requests.post call.

Each name is passed as the second positional argument of requests.post (data), so the call works without naming the optional data parameter explicitly. The length of names also determines the number of threads created.
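To make the pairing concrete, here is what the argument tuples look like (names values taken from the question, the url is a placeholder):

import itertools

names = ["dfg", "dddfg", "qwed"]
url = "https://example.com/api"  # placeholder URL

# zip stops at the shorter iterable, so the infinite repeat is safe
print(list(zip(itertools.repeat(url), names)))
# [('https://example.com/api', 'dfg'),
#  ('https://example.com/api', 'dddfg'),
#  ('https://example.com/api', 'qwed')]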

This pattern does not extend easily to extra parameters, such as other optional keyword arguments.
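If you do need extra parameters, fall back to a small wrapper function, as in Solution 1. A minimal sketch (the timeout value is illustrative; url and names are assumed to be defined as above):

from multiprocessing.dummy import Pool as ThreadPool
import requests

def request_post(name):
    # the wrapper lets you set any keyword arguments you need
    return requests.post(url, data=name, timeout=10)

with ThreadPool(len(names)) as pool:
    res = pool.map(request_post, names)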

Comments

  • For This
    For This almost 2 years

    I have researched this topic a lot, but the problem is I am not able to figure out how to send multithreaded POST requests using Python 3.

    import requests

    names = ["dfg", "dddfg", "qwed"]

    for name in names:
        res = requests.post(url, data=name)
        res.text
    

    Here I want to send all these names, and I want to use multithreading to make it faster.

    • bigbounty
      bigbounty almost 4 years
      It's better if you use asyncio and aiohttp for Python 3 (a sketch of that approach appears after these comments)
    • Felipe
      Felipe almost 4 years
      +1 @bigbounty. I can't stress this enough. OP, check this out.
  • For This
    For This almost 4 years
    Actually, can you add comments? I'm not so familiar with Python right now, and comments would help me understand a little more what each part of this code is actually doing.
  • For This
    For This almost 4 years
    Maybe this second one will work, but I have no idea how I can implement it here.
  • For This
    For This almost 4 years
    while i < len(name):
        print(i, idx)
        try:
            loadproxy = {
                "https": "https://" + proxys[idx],
            }
            myobj = {"name": name[i], "stats": playerStats[i]}
            res = requests.post(url, json=myobj, proxies=loadproxy, timeout=100)
            i += 1
            idx += 1
            if SuccessKey:
                if SuccessKey in res.text:
                    print(res.text)
                else:
                    print("Can't add Failed")
            elif FailureKey:
                if FailureKey not in res.text:
                    print(res.text)
                else:
                    print("Can't add Failed")
        except Exception as e:
            print("Something Went Wrong")
            idx += 1
            continue
  • For This
    For This almost 4 years
    If you can guide me, it would be helpful.
  • imbr
    imbr almost 4 years
    Can you post these detailed requirements in the body of the question? I'm not quite getting the problem you are trying to solve.
  • Richard
    Richard almost 4 years
    Note that with concurrent.futures, both ThreadPoolExecutor (which you want to use for IO-bound tasks like this) and ProcessPoolExecutor have defaults for the number of workers that usually work pretty optimally. So in the above concurrent.futures example you can leave the worker count empty and make the best use of resources: with concurrent.futures.ThreadPoolExecutor() as executor: See docs.python.org/3/library/concurrent.futures.html
  • imbr
    imbr almost 4 years
    Thanks @Richard, you are right. I will note that in the answer.
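
For completeness, here is a minimal sketch of the asyncio + aiohttp approach bigbounty suggests above (the url value is a placeholder, and aiohttp is a third-party package you would need to install):

import asyncio
import aiohttp

names = ["dfg", "dddfg", "qwed"]
url = "https://example.com/api"  # placeholder URL

async def post_one(session, name):
    # POST one payload and return the response body
    async with session.post(url, data=name) as response:
        return await response.text()

async def main():
    # one ClientSession is shared by all requests
    async with aiohttp.ClientSession() as session:
        # schedule all requests concurrently and wait for them all
        return await asyncio.gather(*(post_one(session, n) for n in names))

results = asyncio.run(main())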