I have a large text file containing 700k lines. I want to concatenate or append the small string "/products/all.atom" to every single line.
I tried this code:
import os
import sys
import fileinput

print("Text to search for:")
textToSearch = input("> ")
print("Text to replace it with:")
textToReplace = input("> ")
print("File to perform Search-Replace on:")
fileToSearch = input("> ")
#fileToSearch = 'D:\dummy1.txt'

tempFile = open(fileToSearch, 'r+')

for line in fileinput.input(fileToSearch):
    if textToSearch in line:
        print('Match Found')
    else:
        print('Match Not Found!!')
    tempFile.write(line.replace(textToSearch, textToReplace))
tempFile.close()

input('\n\n Press Enter to exit...')
But the code is not working correctly. What I am doing here is replacing ".com" with ".com/products/all.atom", but when I run it the loop goes on forever and it writes a file 10 GB in size.
Here is an example of what I want:
store1.com > store1.com/products/all.atom
store2.com > store2.com/products/all.atom
store3.com > store3.com/products/all.atom
Please help me.
Try a list comprehension and string concatenation:
with open('file.txt','r') as f:
    print([line.strip() + "/products/all.atom" for line in f])
Output:
['store1.com/products/all.atom', 'store2.com/products/all.atom', 'store3.com/products/all.atom', 'store4.com/products/all.atom']
Updated solution:
Here is how to write the result to a new file:
with open('names','r') as f:
    data = [line.strip() + "/products/all.atom" for line in f]

with open('new_file','w+') as write_f:
    for line_1 in data:
        write_f.write(line_1 + '\n')
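With 700k lines the list comprehension above builds the entire result in memory before writing. If memory is a concern, a minimal streaming sketch (reusing the same 'names' and 'new_file' filenames from above; the first block just creates a small stand-in input) processes one line at a time:

```python
# Create a small sample input file (a stand-in for the real 700k-line file).
with open('names', 'w') as f:
    f.write('store1.com\nstore2.com\nstore3.com\n')

# Stream line by line: read from one file and write to another, so nothing
# is ever read from and written to the same file, and memory use stays flat.
with open('names', 'r') as src, open('new_file', 'w') as dst:
    for line in src:
        dst.write(line.strip() + '/products/all.atom\n')
```

The key difference from the original attempt is that the input and output are two separate files, so the writes cannot grow the file being iterated over.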