Our site receives a .csv file every day, which we use to import data from a different system. The import works correctly sometimes, but I'm running into a problem where the script imports the old data again instead of using the new file.
The system keeps importing the old file until I log in over SFTP and view the file; after that, it grabs the new one. It seems as if the script is loading the old file into memory and never clearing it out.
My code for the import is below. Does anything here explain what might be happening?
function energyuportal_cron() {
    global $CFG;

    // The incoming report is expected at this path each day.
    $path = $CFG->dirroot . '/' . $CFG->energyuportal_filelocation . '/report.csv';

    if (($handle = fopen($path, 'r')) !== false) {
        while (($data = fgetcsv($handle, 0, ',')) !== false) {
            if (energyuportal_check_data($data[1], $data[4])) {
                // This imports the data.
                energyuportal_manage_completions($data[1], $data[4], $data[5], $data[7]);
            }
        }
        fclose($handle);

        // Archive the processed file so the next run sees a fresh one.
        rename($path,
            $CFG->dirroot . '/old/report' . date('Y-m-d-H-ia') . '.csv');
    } else {
        // Error handling.
    }
    return true;
}
Log the result of stat() on the file before doing an fopen(). Also try adding clearstatcache().
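Here is a minimal sketch of that diagnostic, assuming $path points at the same report.csv your cron function reads (the path and the logging target are placeholders to adapt to your setup):

    // Hypothetical diagnostic: assumes $path matches the file the cron reads.
    $path = $CFG->dirroot . '/' . $CFG->energyuportal_filelocation . '/report.csv';

    // Drop PHP's cached stat data for this file so the stat() call below
    // reflects what the filesystem reports right now.
    clearstatcache(true, $path);

    $info = stat($path);
    if ($info !== false) {
        error_log(sprintf('report.csv before fopen: size=%d, mtime=%s',
            $info['size'], date('c', $info['mtime'])));
    } else {
        error_log('report.csv: stat() failed before fopen');
    }

If the logged mtime stays stuck on the old file's timestamp until you browse the directory over SFTP, the staleness is happening below PHP rather than in your script.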
If you were doing an include() I would look at the caching settings of any bytecode compiler, but you are not. So I would start to question the hosting environment and the filesystem layer. One cheap way to test that is to log a checksum of the file on every cron run, as in the sketch below.
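A minimal sketch of that check, reusing the same hypothetical $path as above:

    // Hypothetical check: log a checksum of the file each cron run.
    // If the hash only changes after you browse the directory over SFTP,
    // something below PHP (e.g. an NFS attribute cache) is serving stale content.
    $path = $CFG->dirroot . '/' . $CFG->energyuportal_filelocation . '/report.csv';
    $hash = md5_file($path);
    error_log('report.csv checksum at cron start: ' . ($hash !== false ? $hash : 'unreadable'));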