I'm building a bot that scrapes product prices off eBay and writes the info into a CSV file. The trouble is with how the data is written inside a for loop: only the last product's info ends up in the file. How do I write all of the loop's data into the CSV file?
import csv
import requests
from bs4 import BeautifulSoup

headers = {'User-Agent': 'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/88.0.4324.150 Safari/537.36'}
searchterm = "Laptops"
url = f'https://www.ebay.com/sch/i.html?_from=R40&_nkw={searchterm}&_sacat=0&LH_PrefLoc=1&LH_Auction=1&rt=nc&LH_Sold=1&LH_Complete=1'
response = requests.get(url, headers=headers)
soup = BeautifulSoup(response.content, 'html.parser')

print(len(soup.select('.s-item')))
for item in soup.select('.s-item'):
    try:
        print(item.select('.s-item__title')[0].get_text())
        print(item.select('.s-item__subtitle')[0].get_text())
        print(item.select('.s-item__price')[0].get_text())
        print(item.select('.s-item__location')[0].get_text())
        print(item.select('.s-item__shipping')[0].get_text())
        print("\n\n\n----------------------------------------------")
        with open('Products.csv', mode='w') as file_in:
            writer = csv.writer(file_in)
            writer.writerow(['Product', 'Price'])
            writer.writerow(['', ''])
            writer.writerow([item.select('.s-item__title')[0].get_text(), item.select('.s-item__price')[0].get_text()])
    except Exception as e:
        print("ERROR!", e)
I want the table to look like this:
Structure the code so that you open the file outside of the loop:

with open('Products.csv', mode='w', newline='') as file_in:
    writer = csv.writer(file_in)
    for item in soup.select('.s-item'):
        # etc...

What's happening is that on each iteration of the loop you re-open the file, and because of mode='w' that overwrites the file's contents, so only the last product's row survives. Opening the file once, before the loop, means every writerow() call appends to the same open handle. (Also note that csv.writer() takes the file object — csv.writer(file_in) — and passing newline='' to open() prevents blank lines between rows on Windows.)
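Put together, the corrected structure looks like the sketch below. It uses placeholder product tuples in place of the live soup.select('.s-item') results (those names are stand-ins, not real scraped data), so the CSV-writing pattern can be seen on its own:

```python
import csv

# Placeholder rows standing in for the scraped eBay items.
products = [
    ("Laptop A", "$199.99"),
    ("Laptop B", "$249.00"),
    ("Laptop C", "$310.50"),
]

# Open the file once, before the loop; every writerow() then appends to the
# same open handle instead of overwriting the file on each iteration.
with open('Products.csv', mode='w', newline='') as f:
    writer = csv.writer(f)
    writer.writerow(['Product', 'Price'])  # header written once
    for title, price in products:
        writer.writerow([title, price])
```

In the real scraper, the loop body would pull title and price out of each item with the same .select() calls as before; only the placement of open() and the header row changes.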