I have a PHP script located at /var/www/site/update.php. It is started automatically from cron:
/usr/bin/php /var/www/site/update.php full
But when I start the same script from my website:
<?php exec("/usr/bin/php /var/www/site/update.php full"); ?>
It runs for about 20 minutes and then starts displaying a lot of error messages. At the same time the page stops loading and shows a 504 Gateway Time-out error.
I guess curl won't help either. Are there any other options? The question is how to run the script so it works independently of the browser. The code already exists and works: right now it runs via cron every hour, but I need to be able to trigger it unscheduled by pressing a button or link on the site.
You can call set_time_limit(0);
in the PHP script to prevent PHP from timing out, but a 504 suggests you are running nginx, so you will also have to raise its time limits for that particular resource in the location
block (or the main nginx config) by setting the relevant *_timeout directives.
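As a rough sketch, such a location block might look like the following; the URL, socket path, and timeout values here are assumptions, not taken from the question:

```nginx
# hypothetical location for the URL that triggers the long script;
# adjust the path, socket, and timeouts to your own setup
location = /run-update.php {
    fastcgi_read_timeout 3600s;   # allow the PHP backend a long time to respond
    fastcgi_send_timeout 3600s;

    include        fastcgi_params;
    fastcgi_pass   unix:/run/php/php-fpm.sock;
    fastcgi_param  SCRIPT_FILENAME $document_root$fastcgi_script_name;
}
```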
If you don't need the output refreshed every second, I'd do this:
When the button is pressed, store a "pressed" flag in some file:
if(!empty($_POST['schedule_script_button'])){ touch(__DIR__."/script_is_scheduled.flag"); }
Use cron to run the script every minute:
* * * * * php /var/www/example.com/my-long-script.php
The script has to check whether that flag exists and terminate otherwise; it should also delete the flag, so one button press triggers only one run:
if(!file_exists(__DIR__."/script_is_scheduled.flag")){ exit; } unlink(__DIR__."/script_is_scheduled.flag"); //full script logic executed here
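Spelled out as a whole file, the cron-driven worker could look like this minimal sketch; the file name and flag path are assumptions carried over from the snippets above:

```php
<?php
// my-long-script.php — sketch of the worker that cron runs every minute.
// The flag file name is an assumption; use whatever your button handler touches.

$flag = __DIR__ . "/script_is_scheduled.flag";

if (file_exists($flag)) {
    // Consume the flag first, so the next cron tick doesn't start a second run
    // while this one is still going.
    unlink($flag);

    // The job may run far longer than a minute; lift the CLI time limit.
    set_time_limit(0);

    // ... full script logic goes here; echo progress so cron can capture it.
    echo "run started at " . date("c") . "\n";
}
// No flag: nothing was requested, so do nothing until the next minute.
```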
If the flag is set, run the script and save the output into a file. If you are going with crontab, you can echo from the script and redirect the output in the crontab entry:
* * * * * php /var/www/example.com/my-long-script.php > /var/www/example.com/my-script-output.txt
(Note that > truncates the file on every cron run, including the no-op ones; use >> to append, or write the file from inside the script, if the output needs to persist.)
Your page can then read my-script-output.txt
with the file_get_contents
PHP function and simply echo its contents. I'm against setting the execution limit on web URLs to high values: attackers can use long-running requests to tie up your server's resources and take your site down.
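Putting the web side together, a status page along these lines would both set the flag and show the captured output; the file names and the button name are assumptions matching the snippets above, not fixed APIs:

```php
<?php
// status.php — hypothetical page tying the steps together.

// Store the "pressed" flag when the button is submitted.
function schedule_run(string $flagFile): void
{
    touch($flagFile);
}

// Read the output captured by cron, HTML-escaped for safe display.
function render_output(string $outputFile): string
{
    if (!is_readable($outputFile)) {
        return "No output yet.";
    }
    return "<pre>" . htmlspecialchars(file_get_contents($outputFile)) . "</pre>";
}

if (!empty($_POST['schedule_script_button'])) {
    schedule_run(__DIR__ . "/script_is_scheduled.flag");
}

echo render_output(__DIR__ . "/my-script-output.txt");
```

Since the page only writes a flag and reads a file, it returns immediately; the long work itself stays in cron, outside any browser or nginx timeout.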