Tags: python, multithreading, file-upload, python-multithreading

Why is multithreading not efficient in my code?


My code reads file names from filenames.txt and then, for each name, sends a request with that file and prints the response.

For some reason it uploads each file multiple times, and I can't understand why.

import requests
import multiprocessing.dummy

# headers2 and cookies1 are defined elsewhere in the original script
f = open('filenames.txt','r')
results = [x.strip() for x in f.readlines()]

def makerequest(files):
    for filename in results:
        files = {'file': open(filename, 'rb')}
        url = 'https://www.google.com/'
        r = requests.post(url, headers=headers2, files=files, cookies=cookies1)
        print(r.text)

processes = []
pool = multiprocessing.dummy.Pool(10)
pool.map(makerequest, results)
pool.close()
pool.join()

I want each thread to send a request with a different filename, e.g.

Thread 1 uploads the file 1.txt

Thread 2 uploads the file 2.txt
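
In other words (a minimal sketch with made-up filenames, using the thread pool from multiprocessing.dummy): each filename passed to pool.map should be handled by exactly one call, and each call should run in one of the pool's worker threads.

import threading
from multiprocessing.dummy import Pool   # thread pool

def upload(filename):
    # each call gets exactly one filename and runs in one worker thread
    print(threading.current_thread().name, 'uploads', filename)

pool = Pool(10)
pool.map(upload, ['1.txt', '2.txt', '3.txt'])   # made-up filenames
pool.close()
pool.join()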


Solution

  • You pass a single filename as the argument in def makerequest(files):, but you never use it; instead, every call runs a for-loop over all the filenames, so each file gets uploaded once per worker call. Remove the for-loop and use the argument itself.

    def makerequest(filename):
        files = {'file': open(filename, 'rb')}
        url = 'https://www.google.com/'
        r = requests.post(url, headers=headers2, files=files, cookies=cookies1)
        print(r.text)
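
  • Putting it together, a full sketch of the corrected script could look like the code below. headers2, cookies1 and the URL are placeholders carried over from the question, and with open(...) is used so each file handle is closed after its upload.

    import requests
    from multiprocessing.dummy import Pool   # thread-based pool

    headers2 = {}    # placeholder - the real headers come from the original script
    cookies1 = {}    # placeholder - the real cookies come from the original script
    url = 'https://www.google.com/'

    with open('filenames.txt') as f:
        filenames = [line.strip() for line in f]

    def makerequest(filename):
        # the context manager closes the file once the upload is done
        with open(filename, 'rb') as fh:
            r = requests.post(url, headers=headers2, files={'file': fh}, cookies=cookies1)
        print(r.text)

    pool = Pool(10)                     # 10 worker threads
    pool.map(makerequest, filenames)    # one filename per call
    pool.close()
    pool.join()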