I need to download approximately 1000 files (one per URL), and downloading them manually would be impractical.
I put the URLs in a list and loop over it, but I think my code overwrites the previous files and keeps only the last item in the list.
Here is my code:
#!/usr/bin/env python
import urllib3

http = urllib3.PoolManager()
urls = ["http://url1.nt.gz", "http://url2.nt.gz", "http://url3.nt.gz"]
N = 1  # counter helps me rename the downloaded files

print("downloading with urllib")
for url in urls:
    r = http.request('GET', url)
    Name = str(N + 1)  # each time increment the counter by one
    with open("file" + Name + ".nt.gz", "wb") as fcont:
        fcont.write(r.data)
Any suggestions?
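For comparison, here is a sketch of what I think the loop should look like (untested, assuming Python 3 and the same placeholder URLs): enumerate() hands out a fresh index on every iteration, so each download would go to its own file instead of overwriting the previous one.

#!/usr/bin/env python
import urllib3

http = urllib3.PoolManager()
urls = ["http://url1.nt.gz", "http://url2.nt.gz", "http://url3.nt.gz"]

# enumerate() yields (index, url) pairs, starting the index at 1,
# so every file gets a distinct name
for n, url in enumerate(urls, start=1):
    r = http.request('GET', url)
    with open("file" + str(n) + ".nt.gz", "wb") as fcont:
        fcont.write(r.data)

Is that the right direction, or is there a better pattern for ~1000 URLs?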
print "downloading with urllib"
for url in urls
r = http.request('GET',url)