Currently I am writing URLs to a txt file using the following:
$data['url'] = $element->href;
$data['image'] = $image->content;
$data['title'] = $title2->plaintext;
$data['genre'] = $genre->plaintext;
file_put_contents('done.txt', $data['url'] . PHP_EOL, FILE_APPEND);
The above is being used inside a foreach loop, and the website I am collecting data from has 39 results per page. I am wondering how I could delete the previous 39 results on the 40th post without interrupting the current input flow, so that the txt file holds a maximum of 39 URLs at any one time.
EDIT: Sloppy mistake on my part, sorry.
$html = file_get_html('http://oursite.com');
foreach ($html->find('.ourclass') as $element)
{
    $data['url'] = $element->href;
    $data['image'] = $image->content;
    $data['title'] = $title2->plaintext;
    $data['genre'] = $genre->plaintext;
    file_put_contents('done.txt', $data['url'] . PHP_EOL, FILE_APPEND);
}
Above is how my foreach loop is laid out.
Change the flags argument based on the current loop iteration.
Assuming $i is the current iteration index in your foreach loop...
file_put_contents('done.txt', $data['url'] . PHP_EOL, $i % 39 ? FILE_APPEND : 0);
This truncates the file whenever $i is a multiple of 39 (a flags value of 0 overwrites instead of appending), so the records are emptied and restarted on the 40th post and the file never holds more than 39 URLs at a time.
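Your foreach as posted has no index variable, so here is a minimal sketch of how the counter could be wired into your existing loop. It assumes simple_html_dom's find() returns a numerically indexed array, so the array key can serve as $i; if that doesn't hold in your version, an explicit counter incremented at the bottom of the loop body works the same way.

$html = file_get_html('http://oursite.com');
foreach ($html->find('.ourclass') as $i => $element)
{
    $data['url'] = $element->href;
    // ... image, title and genre lookups as in your loop ...

    // $i % 39 is 0 on iterations 0, 39, 78, ... so those writes
    // truncate the file (flags of 0); every other write appends.
    file_put_contents('done.txt', $data['url'] . PHP_EOL, $i % 39 ? FILE_APPEND : 0);
}

With 39 results per page, each run starts at $i = 0, so the file is cleared at the top of every page and ends up holding just that page's URLs.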