I have a PowerShell script that occasionally fails when I run it on a server. I can't reproduce the error on my local system.
The script writes logging information to a file. Lines like this one will fail occasionally:
Write-Output "`nRead the page count" | Out-File -LiteralPath $logfile -Append
Every step in the script gets an entry in the log, so the script writes to this file a lot; the finished log is about 100 kB of text. $logfile is on a network share, and the same share is used for dozens of other read and write actions over the course of the script. When the line above fails, subsequent write commands to the same file still succeed.
Is there a way to make writing to this log file more robust?
A colleague confirmed that scripts tend to have trouble writing files to that particular network share. I ended up creating a temporary file:
$logfileTemp = New-TemporaryFile
All of the logging calls go to this file, which is stored locally on the computer running the script. That gets rid of any issues writing to the network share.
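As a sketch of what that redirection looks like (Write-Log is a name I made up for illustration; the actual script may just inline the Out-File calls):

```powershell
# Local temp file that receives every log entry during the run.
$logfileTemp = New-TemporaryFile

function Write-Log {
    param([string]$Message)
    # Writing to local disk avoids the flaky network share entirely.
    $Message | Out-File -LiteralPath $logfileTemp -Append -Encoding utf8
}

Write-Log "`nRead the page count"
```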
Then I made a function to copy the contents from this temporary file to a file on the network share. This function is called at all points where the script could stop processing, to make sure I can access the log file if the script fails.
function Exit-MyScript {
    # Flush the local log to the network share, then clean up the temp file.
    Get-Content -LiteralPath $logfileTemp | Out-File -LiteralPath $logfile -Append -Encoding utf8
    Remove-Item -LiteralPath $logfileTemp
    exit
}
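If you'd rather keep some writes going straight to the share, a retry wrapper can also smooth over transient failures. This is a sketch of that idea, not part of my actual script; the function name, attempt count, and delay are all illustrative:

```powershell
# Retry a flaky Out-File a few times before giving up.
function Write-LogWithRetry {
    param(
        [string]$Path,
        [string]$Message,
        [int]$MaxAttempts = 3
    )
    for ($i = 1; $i -le $MaxAttempts; $i++) {
        try {
            # -ErrorAction Stop turns write failures into catchable exceptions.
            $Message | Out-File -LiteralPath $Path -Append -Encoding utf8 -ErrorAction Stop
            return
        }
        catch {
            if ($i -eq $MaxAttempts) { throw }   # out of attempts: surface the error
            Start-Sleep -Milliseconds 500        # brief pause before retrying
        }
    }
}
```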