I'm saving the contents from a web crawler into a text file. The crawler goes through multiple URLs, so I loop over the URLs array with a foreach.
The problem is that each URL yields 49 entries, and when I print the data to the file, every 49th line contains a double entry.
Is there any way I can add a line break after printing each array from the foreach loop?
foreach ($urls as $url) {
    $source    = file_get_contents($url);
    $roughHtml = rough_html($source);
    $scraped   = extract_ips($roughHtml);
    $readyD    = implode("\n", $scraped);
    file_put_contents($filename, $readyD, FILE_APPEND);
}
It prints the array fine and the data ends up in the file, but every 49th line looks like this:
124.232.136.12:2160
196.201.216.170:779186.89.105.127:8080
186.95.69.6:8080
Any help with this issue?
I think you need to append a newline to $readyD in the file_put_contents call. implode("\n", $scraped) only puts newlines between entries, not after the last one, so the final entry of one batch runs straight into the first entry of the next batch; that's why every 49th line shows two entries joined together.

file_put_contents($filename, $readyD . "\n", FILE_APPEND);
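
For context, here's a minimal sketch of the whole loop with that change applied, assuming your rough_html() and extract_ips() helpers stay as they are and extract_ips() returns an array of ip:port strings:

foreach ($urls as $url) {
    $source    = file_get_contents($url);   // fetch the page
    $roughHtml = rough_html($source);        // your helper
    $scraped   = extract_ips($roughHtml);    // array of ip:port strings
    $readyD    = implode("\n", $scraped);    // newlines between entries only

    // Trailing "\n" makes sure the next batch starts on a fresh line.
    file_put_contents($filename, $readyD . "\n", FILE_APPEND);
}

The only change from your original code is the . "\n" in the last line; everything else can stay the same.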