I have a Perl script that logs all of its operations to a log file.
It was working fine until I made some changes to fix a few issues. Now I am seeing a strange problem: the file handle gets stuck while writing the log, and the log file is not updated until I exit the script with the exit (X) option.
For example, my script offers these options:
Extract
Validate
Back up
X. Exit
Now, when I run the extraction, it updates the log. But after the extraction completed, I opened the log file and the last lines looked like this:
Date: XYZ file extracted.
Date: XXXX file is preparing to extract
Date:
with no further updates, even though it had extracted all the files. Then I exited the script using the X option, and only after that could I see the complete log.
I don't understand why it gets stuck, and this happens for all the other options as well. Previously it was working fine. I am simply opening a filehandle and printing to it:
open FILE, ">>log.txt";
# ... do some work ...
print FILE $output;
Can someone tell me what might be the issue?
You're encountering file buffering. This is normal behavior in most programming languages. Because writing to disk or to a terminal is slow, each print statement doesn't necessarily write immediately. The output is usually stored in memory (i.e. "buffered") and written out once a certain amount of data has accumulated or a certain character is seen.
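You can see this for yourself with a short sketch (the file name demo.log is just an illustration): the printed line typically doesn't appear on disk until the handle is closed.

use strict;
use warnings;

open my $fh, '>>', 'demo.log' or die "Cannot open demo.log: $!";
print $fh "this line sits in a memory buffer\n";

sleep 30;    # check demo.log from another terminal now: likely still empty

close $fh;   # closing flushes the buffer, so the line finally reaches disk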
By default, STDERR is not buffered. STDOUT connected to a terminal is usually line-buffered, meaning it writes when it sees a newline. Files are buffered in blocks, usually 4K or 8K, depending on your system.
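A quick way to observe the difference in a terminal (a sketch; exact timing depends on your system):

use strict;
use warnings;

print STDOUT "buffered, no newline yet...";     # held in the line buffer
print STDERR "STDERR appears immediately\n";    # unbuffered by default
sleep 3;
print STDOUT " flushed by this newline\n";      # newline triggers the write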
For a log file, it makes sense to turn this buffering off. All filehandles respond to IO::Handle methods, so use the IO::Handle method autoflush to do this.
use autodie;      # because you've forgotten to check if the open succeeded
use IO::Handle;   # needed for autoflush on Perls older than 5.14

open my $fh, ">>", $log_file;
$fh->autoflush(1);
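Applied to your own logging code (log.txt and $output taken from your question), the fix looks like this sketch:

use strict;
use warnings;
use autodie;
use IO::Handle;

open my $log_fh, '>>', 'log.txt';
$log_fh->autoflush(1);

print {$log_fh} $output;   # written to log.txt right away, no buffering delay

If you only need an occasional forced write rather than unbuffered output everywhere, you can instead call $fh->flush at the points that matter.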