I've purchased shared hosting, and the provider has set a 30-second execution limit for PHP scripts. My script is run via CRON, and its execution can take more than 30 seconds.
Basically, my script contains a foreach loop that inserts entries into the DB:
foreach ($items as $item) {
    // prepare the PDO query, bind parameters, etc.
    $db->execute();
}
Is there any way to work around the limit? For example, pause the loop after 20 seconds and then re-run the script?
I can't use set_time_limit().
By the way, the CRON job runs only once per day.
What you can do is check in each iteration how long the script has been running, and when you are closing in on the max execution time, save the last record you've processed. Then on the next run, catch up and start with that particular record. It should work something like this:
// Place this at the very start of your script.
$start_time = microtime(true); // true is important, otherwise it'll return a string.

// ini_get() returns a string, so cast it; note that 0 means "no limit".
$max_execution_time = (int) ini_get('max_execution_time');

// Logic to retrieve the records, taking the last processed record into
// account, e.g. in your query: WHERE id > :last_record_processed

foreach ($items as $item) {
    // do something interesting with the item

    $current_time = microtime(true);
    if (($current_time - $start_time) > ($max_execution_time - 5)) {
        // Remember the last processed record and stop,
        // so the next run can pick up from here.
        $last_record_processed = $item;
        // store the id or whatever somewhere;
        break;
    }
}
You'll have to figure out for yourself the best way to store the last processed record, what safety margin you want to build in (here: 5 seconds), and so on.
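One simple option for storing the checkpoint is a small state file. Here is a minimal sketch of that idea; the file path and function names are assumptions for illustration, and a DB table would work just as well:

```php
<?php
// Hypothetical checkpoint file; pick any writable location.
const STATE_FILE = '/tmp/import_checkpoint.txt';

// Load the id of the last processed record; 0 means "start from the beginning".
function load_checkpoint(): int {
    return is_file(STATE_FILE) ? (int) file_get_contents(STATE_FILE) : 0;
}

// Save the id of the last record finished before bailing out.
function save_checkpoint(int $id): void {
    file_put_contents(STATE_FILE, (string) $id, LOCK_EX);
}

// Remove the checkpoint once the whole batch is done,
// so the next full run starts over from id 0.
function clear_checkpoint(): void {
    if (is_file(STATE_FILE)) {
        unlink(STATE_FILE);
    }
}
```

You would then call `load_checkpoint()` before building the query (`WHERE id > ...`) and `save_checkpoint($item->id)` right before the `break`.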
And for this to work, you'll have to run the cron job every hour, or maybe even more frequently. Otherwise it'll only process one portion of your records each day. You should, however, first check whether the run is continuing the current batch or has to start over (which should only happen once a day has passed).
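That "continue or start over" check could be sketched like this. The state-file path, JSON layout, and function names are assumptions, and the 24-hour threshold mirrors the once-per-day batch described above:

```php
<?php
// Hypothetical state file recording when the current batch was started.
const STATE_FILE = '/tmp/import_state.json';

// Decide whether this cron run should start a fresh batch
// or continue the one already in progress.
function should_start_over(int $now): bool {
    if (!is_file(STATE_FILE)) {
        return true; // no batch in progress yet
    }
    $state = json_decode(file_get_contents(STATE_FILE), true);
    // Start over once a full day (86400 s) has passed since the batch began.
    return ($now - $state['batch_started']) >= 86400;
}

// Record the start of a new batch, resetting the checkpoint to 0.
function begin_batch(int $now): void {
    $state = ['batch_started' => $now, 'last_id' => 0];
    file_put_contents(STATE_FILE, json_encode($state), LOCK_EX);
}
```

Each hourly run would call `should_start_over(time())` first: if it returns true, call `begin_batch(time())` and process from id 0; otherwise resume from the stored `last_id`.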