In a PHP script intended to work on generic shared LAMP hosting, I am using `require()` as a function to read data from a file. The data file looks like this:
```php
<?php
# storage.php
return [
    // Rotating secrets
    'last_rand' => '532e89355b78aafdb85f5f01f0eed20440d6bd9e0a2d6ae1bd17be4e1d7d21c7bb7a822a2077e3f4',
];
```
And I read it like this:
```php
$data = require($root . '/storage.php');
$lastRand = $data['last_rand'];
```
In my script, I will read information from a configuration file using this approach. In some cases, the operation will re-write a new config file (using `file_put_contents()`) and then re-read it, again using `require()`.
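For context, the rewrite step has roughly this shape (a minimal sketch only; my real code builds the array and the new random value elsewhere, and uses the same `$root` base path as above):

```php
// Sketch of the rewrite: build the new array, serialise it as a PHP file
// that returns it, and overwrite storage.php with it.
$newData = [
    // Illustrative value only; the real secret is generated elsewhere.
    'last_rand' => bin2hex(openssl_random_pseudo_bytes(40)),
];
$source = "<?php\n\n# storage.php\nreturn " . var_export($newData, true) . ";\n";
file_put_contents($root . '/storage.php', $source);

// Later in the same request, the file is re-read exactly as before.
$data = require($root . '/storage.php');
$lastRand = $data['last_rand'];
```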
This has worked solidly so far, on a variety of LAMP hosts, but today I found a web host where `require()` does not seem to notice changes in the file. This host is using PHP 5.6, whereas all the others I have tested are using 7+.
What's extremely odd is that `require()` gets the old contents of the file even though `file_get_contents()` can see the new version, as if `require()` is doing its own internal caching.
I have even tried waiting for `require()` to catch up, in case some underlying cache needed time to expire:
```php
$ok = true;
$t = microtime(true);
while ($oldRandom != $this->getFileService()->requirePhp($this->getStoragePath())['last_rand'])
{
    $elapsedTime = microtime(true) - $t;
    if ($elapsedTime > 4) // Give up after four seconds
    {
        $ok = false;
        break;
    }
    usleep(1000);
}
```
That did not work either (the timeout just expires with no change in the results brought back from `require()`).
However, upon the next run of the PHP script, `require()` suddenly sees the new file contents.
I've also tried deleting the `storage.php` file before re-reading it, and that has no effect either! I was really expecting that to help.
So, I don't understand this behaviour at all. To work around it I could:

- keep the newly written values in memory for the rest of the request, rather than re-reading the file, or
- use `file_get_contents()` rather than `require()` (for rather dull reasons it's not as convenient, but I'll prefer whatever works reliably); a rough sketch of that route is below.

However, these both feel like I am ignoring a bug, and that I should investigate the cause. I happen to know also that this host (a free one) is nearly always under heavy CPU load. It's a 64-bit server running Linux on the rather old 2.6 kernel, and `phpinfo()` indicates the CGI/FastCGI server API is in force.
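For reference, the `file_get_contents()` route would mean changing the storage format, since the current file is PHP that only makes sense via `require()`. A rough sketch, assuming a switch to JSON (which my current code does not do; the way the value is generated here is just illustrative):

```php
// Sketch only: if the storage file held JSON rather than PHP, it could be
// written and re-read with plain file functions, sidestepping require()
// and whatever caching it is doing.
$newRandom = bin2hex(openssl_random_pseudo_bytes(40)); // illustrative value
file_put_contents(
    $root . '/storage.json',
    json_encode(['last_rand' => $newRandom], JSON_PRETTY_PRINT),
    LOCK_EX
);

// Re-reading it later in the same request.
$data = json_decode(file_get_contents($root . '/storage.json'), true);
$lastRand = $data['last_rand'];
```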
Can anything be done to mitigate this behaviour?
Some helpful commenters suggest this problem could be related to opcaching, which would fit, since the opcode cache affects files loaded via `require()` but not reads done with `file_get_contents()`. I've added this code after the `file_put_contents()` call that re-writes the config:
```php
$reset = opcache_reset(); // Returns true
$invalid = opcache_invalidate($this->getStoragePath()); // Returns false
```
However, it makes no difference: `require()` stubbornly reads the same value. I have confirmed this by doing a `require()` before and after the file write, and also a `file_get_contents()` to get the true contents.
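For completeness, there are a couple of further things I could try at that point. This is a sketch rather than something I have confirmed on this host, and it assumes the host has not disabled these calls via `opcache.restrict_api`:

```php
$path = $this->getStoragePath();

// Force the invalidation regardless of file timestamps (second argument).
if (function_exists('opcache_invalidate')) {
    $forced = opcache_invalidate($path, true);
}

// Clear PHP's stat and realpath caches for this path as well, in case
// stale file metadata is part of the picture.
clearstatcache(true, $path);

// Inspect the settings that govern when OPcache rechecks files on disk.
var_dump(
    ini_get('opcache.enable'),
    ini_get('opcache.validate_timestamps'),
    ini_get('opcache.revalidate_freq'),
    ini_get('opcache.restrict_api')
);
```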
When I originally encountered this error, I was hesitant to work around it, since it felt like a critical bug that ought not be ignored. However, now that the culprit is likely to be opcaching (and thus related to `require()` specifically), I think working around it by keeping values in memory is not a bad solution. I shall have to remember that `require()` can only be trusted to give a fresh read of a given file once per HTTP request, even if that does not hold true for all PHP installations.
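To illustrate the in-memory approach: since the script already has the new values in hand at the moment it writes them, it can simply keep using those values for the rest of the request instead of asking `require()` to re-read the file. A minimal sketch (the class and method names here are invented for illustration, not taken from my real code):

```php
class Storage
{
    /** @var string Path to storage.php */
    private $path;

    /** @var array|null Values as last written or read during this request */
    private $cache = null;

    public function __construct($path)
    {
        $this->path = $path;
    }

    public function read()
    {
        // Hit require() at most once per request; after that, trust memory.
        if ($this->cache === null) {
            $this->cache = require $this->path;
        }
        return $this->cache;
    }

    public function write(array $data)
    {
        $source = "<?php\n\nreturn " . var_export($data, true) . ";\n";
        file_put_contents($this->path, $source);

        // Keep the freshly written values so later reads in this request
        // never depend on require() noticing the change on disk.
        $this->cache = $data;
    }
}
```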
I am conscious also that if I did try to work around this, I would have to contend with a number of opcache invalidation mechanisms, which is more complexity than I am comfortable with.