
How do you read URLs from a .txt file to perform requests.get and save the responses to a file?


I am able to save the response for a single URL, but I have a list of URLs in a .txt file and would like to save all of their responses. How can I read the URLs from the .txt file and save the responses in Python?

This is what I currently have. Thanks!

import requests

data = requests.get('https://www.url.com')

# open in text mode, since data.text is a str
with open('file.txt', 'w') as f:
    f.write(data.text)

Solution

  • You first want to read the URLs from the input file 'infile.txt', then iteratively send the requests and write the responses to an output file 'outfile.txt':

    with open('infile.txt', 'r') as f:
        # strip the trailing newline from each line so requests gets a clean URL
        urls = [line.strip() for line in f.readlines()]

    datalist = []
    for url in urls:
        data = requests.get(url)
        datalist.append(data.text)

    with open('outfile.txt', 'w') as f:
        for item in datalist:
            f.write("%s\n" % item)
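If some of the URLs might be unreachable, you could also wrap each request in a try/except so one failure doesn't stop the whole loop. Here is a minimal sketch of that idea; the 10-second timeout and the failure message format are just illustrative choices, not requirements:

    import requests

    with open('infile.txt', 'r') as f:
        urls = [line.strip() for line in f if line.strip()]

    with open('outfile.txt', 'w') as out:
        for url in urls:
            try:
                # timeout is arbitrary here; pick whatever suits your URLs
                response = requests.get(url, timeout=10)
                response.raise_for_status()
                out.write("%s\n" % response.text)
            except requests.RequestException as exc:
                # record the failure instead of aborting the whole run
                out.write("Failed to fetch %s: %s\n" % (url, exc))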