Urllib for Python seems to be incredibly slow at uploading a file (using multipart/form-data).
The browser (Chrome) does it in under 20 seconds, while the script takes almost a minute for the same file.
I'm using urllib2 for the connection and poster to build the HTTP headers and multipart data; the Python version in question is 2.7.
import math, random, urllib2
from poster.encode import multipart_encode
from poster.streaminghttp import register_openers

register_openers()  # install poster's streaming handlers into urllib2

def upSong(fileName):
    datagen, headers = multipart_encode({"mumuregularfile_0": open(fileName, "rb")})
    uploadID = math.floor(random.random() * 1000000)
    request = urllib2.Request("http://upload0.mumuplayer.com:443/?browserID=" + browserID + "&browserUploadID=" + str(uploadID), datagen, headers)
    urllib2.urlopen(request).read()
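For what it's worth, each phase can be timed separately to see where the minute goes; a minimal sketch, assuming request is built as in upSong above (the split between urlopen and read is the interesting part):

import time

start = time.time()
response = urllib2.urlopen(request)  # sends the multipart body, returns once headers arrive
sent = time.time()
response.read()                      # fetches the server's reply
done = time.time()
print "send took %.1fs, read took %.1fs" % (sent - start, done - sent)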
Is there a way to speed up Python's/urllib's connection, or is this just a limitation of the language?
EDIT: It should be noted that I already tested all the parts, and the slow one is without a doubt the final .read() call.
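If it helps reproduce this, the final read can also be done in chunks to watch how fast the reply actually arrives (the 8192 chunk size is arbitrary, and request is again the one from upSong):

import time

response = urllib2.urlopen(request)
received, start = 0, time.time()
while True:
    chunk = response.read(8192)  # pull the reply in small pieces
    if not chunk:
        break
    received += len(chunk)
print "read %d bytes in %.1fs" % (received, time.time() - start)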
Chromium probably uses compression (if the website supports it), while urllib does not appear to: grepping the urllib source for "gz" or "bz" gives no result.
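For completeness, urllib2 indeed neither advertises nor decodes gzip on its own; a sketch of requesting a compressed response by hand, assuming the server honours Accept-Encoding (url here stands for the upload URL from the question):

import gzip
import urllib2
from StringIO import StringIO

req = urllib2.Request(url)
req.add_header("Accept-Encoding", "gzip")
resp = urllib2.urlopen(req)
data = resp.read()
if resp.info().get("Content-Encoding") == "gzip":
    # urllib2 hands back the raw bytes, so decompression is manual
    data = gzip.GzipFile(fileobj=StringIO(data)).read()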
I am not sure about it, but Chromium may also be better optimized than a plain urllib connection, using socket-level tweaks or something similar.
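As a sketch of what such socket-level tuning could look like in Python 2.7 (whether it actually closes the gap is untested, and this plain handler ignores poster's streaming openers):

import httplib
import socket
import urllib2

class TunedHTTPConnection(httplib.HTTPConnection):
    def connect(self):
        httplib.HTTPConnection.connect(self)
        # disable Nagle's algorithm so pending chunks go out immediately
        self.sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)
        # ask the OS for a bigger send buffer (a hint, not a guarantee)
        self.sock.setsockopt(socket.SOL_SOCKET, socket.SO_SNDBUF, 256 * 1024)

class TunedHTTPHandler(urllib2.HTTPHandler):
    def http_open(self, req):
        return self.do_open(TunedHTTPConnection, req)

opener = urllib2.build_opener(TunedHTTPHandler)
# opener.open(request) would then upload over the tuned socket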