I wrote a PHP cURL script whose job is to download a CSV file from a website after successfully logging in. It works fine when I run it from my browser, but it fails when I put it on the cron job list. I saw a memory exhausted error in my log once, so I guess my server gives cron less memory.
How can I get around this problem?
Here is the part of the code that does the download work; it's nothing unusual:
<?php
...
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_POST, 1);
// Cookie jar and cookie file use a relative path
curl_setopt($ch, CURLOPT_COOKIEJAR, 'cookie.txt');
curl_setopt($ch, CURLOPT_COOKIEFILE, 'cookie.txt');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
curl_setopt($ch, CURLOPT_POSTFIELDS, $postfields);
// Stream the response body straight into data.csv (also a relative path)
$fp = fopen("data.csv", "w");
curl_setopt($ch, CURLOPT_FILE, $fp);
curl_exec($ch);
curl_close($ch);
fclose($fp);
Just to say it again: this is only part of the code, and everything works fine in the browser but fails under cron.
Finally got everything working with my original settings:
cd /home7/philbike/public_html/atlanticauto/assets/components/cronmanager/ && php cron.php
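The cd matters because cron starts the script from its own working directory, not from the script's directory, so relative paths like cookie.txt and data.csv no longer resolve where they do when the web server runs the script. As a rough alternative (just a sketch, not part of the original script, and assuming cron.php lives in that cronmanager directory), the same effect can be achieved from inside cron.php itself:

<?php
// Sketch: make cron.php independent of the directory cron launches it from.
// __DIR__ is the directory this file lives in, so relative paths such as
// cookie.txt then resolve there no matter how the script is invoked.
chdir(__DIR__);
// ... the rest of cron.php (the cURL calls above) ...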
I also had to write the path to the CSV file out in full, without using any system variables:
/home7/philbike/public_html/atlanticauto/data.csv
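Alternatively, the absolute path can be built from the script's own location at run time instead of being hard-coded. The sketch below assumes the directory layout implied by the paths above and reuses the same $ch handle from the snippet in the question:

// Sketch: build absolute paths relative to this script rather than relying
// on cron's working directory. The layout (.../atlanticauto/assets/components/cronmanager)
// is assumed from the paths quoted above.
$baseDir = dirname(dirname(dirname(__DIR__)));   // .../public_html/atlanticauto
$csvPath = $baseDir . '/data.csv';
$cookie  = __DIR__ . '/cookie.txt';

curl_setopt($ch, CURLOPT_COOKIEJAR, $cookie);
curl_setopt($ch, CURLOPT_COOKIEFILE, $cookie);
$fp = fopen($csvPath, "w");
curl_setopt($ch, CURLOPT_FILE, $fp);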