We need to untar/unzip a very large archive file on a server that we don't have SSH access to. One of our options is to execute a system command via PHP to extract the file.
The problem is that it takes quite a while to unzip/untar the file, and PHP ends up timing out before the command completes.
Is there a way to make the untar command run to completion even if the PHP script times out, like an asynchronous execution?
Or how can we temporarily set the PHP timeout to something extremely high so the script has enough time to finish untarring the file?
From the manual:
The set_time_limit() function and the configuration directive max_execution_time only affect the execution time of the script itself. Any time spent on activity that happens outside the execution of the script such as system calls using system(), stream operations, database queries, etc. is not included when determining the maximum time that the script has been running. This is not true on Windows where the measured time is real.
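So on Linux, the time tar spends inside system() isn't counted against the limit at all. If you still want to lift PHP's own limit anyway (and on Windows, where real time is measured, you'd need to), a minimal sketch at the top of the script:

    <?php
    // set_time_limit() takes seconds; 0 removes the limit for this request
    set_time_limit(0);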
What may be happening is that your client (browser) is timing out. Such a disconnection may cause the script to abort. To prevent that, you can do:
ignore_user_abort(true);
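Putting that together with the timeout note above, a minimal sketch of a synchronous version might look like this (the archive and destination paths are placeholders, not anything from your setup):

    <?php
    ignore_user_abort(true);  // keep running even if the browser disconnects
    set_time_limit(0);        // belt and braces; see the manual note above

    // run tar in the foreground; on Linux this time isn't counted anyway
    system('tar -xzf /path/to/archive.tar.gz -C /path/to/destination', $exitCode);
    echo $exitCode === 0 ? 'done' : "tar exited with code $exitCode";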
You'll then want a second script that lets you check progress. One idea is to have tar's output written to a file whose tail your second script can check to see whether it's done. Another is to invoke a shell script that in turn runs tar; that shell script could either check the return code and touch a file with the result, or create a rudimentary lock file while it is working.
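As a rough sketch of that background approach, assuming hypothetical paths and a done-marker convention of my own (nothing here is standard):

    <?php
    // start-untar.php: detach tar from the request so it survives a timeout
    $archive = '/path/to/archive.tar.gz';  // placeholder
    $dest    = '/path/to/destination';     // placeholder
    $log     = '/tmp/untar.log';           // tar's stdout/stderr goes here
    $done    = '/tmp/untar.done';          // written with tar's exit code when finished

    // nohup + & detaches the process; redirecting output lets PHP return immediately
    $inner = "tar -xzf $archive -C $dest; echo \$? > $done";
    shell_exec('nohup sh -c ' . escapeshellarg($inner) . ' > ' . escapeshellarg($log) . ' 2>&1 &');
    echo 'Extraction started.';

A second script can then poll for the marker:

    <?php
    // check-untar.php: report progress based on the done-marker
    $done = '/tmp/untar.done';
    if (!file_exists($done)) {
        echo 'Still running...';
    } else {
        $code = trim(file_get_contents($done));
        echo $code === '0' ? 'Finished OK.' : "tar exited with code $code.";
    }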