
PowerShell Get-Content with Wait flag and IOErrors


I have a PowerShell script that spawns a number of other PowerShell scripts in a fire-and-forget way.

In order to keep track of the progress of all the scripts I just started, I create a temp file where I have all of them write log messages in JSON format to report progress.

In the parent script I then monitor that log file using Get-Content -Wait. Whenever a line arrives in the log file, I parse the JSON and update an array of objects that I then display using Format-Table. That way I can see how far the different scripts are in their process and whether they fail at a specific step. That works well... almost.
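For illustration, a single progress message might look like this (the Service/step/status field names are taken from the monitoring script below; the values are made up):

```powershell
# Hypothetical example of one progress line. Field names match what the
# tailing script reads ($logMsg.Service, $logMsg.step, $logMsg.status);
# the values themselves are invented for illustration.
$jsonLogMessage = [pscustomobject]@{
    Service = "OrderService"   # which spawned script is reporting
    step    = "Build"          # the column to update in the status table
    status  = "OK"             # the value shown in that column
} | ConvertTo-Json -Compress

Add-Content $statusFile $jsonLogMessage
```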

I keep running into IOErrors because so many scripts are accessing the log file, and when that happens the script just aborts and I lose all information on what is going on.

I would be able to live with the spawned scripts running into an IOError because they just continue and then I just catch the next message. I can live with some messages getting lost as this is not an audit log, but just a progress log.

But when the script that tails the log crashes then I lose insight.

I have tried to wrap this in a Try/Catch, but that doesn't help. I have tried setting -ErrorAction Stop inside the Try/Catch, but that still doesn't catch the error.

My script that reads looks like this:

function WatchLogFile($statusFile)
{
    Write-Host "Tailing statusfile: $($statusFile)"
    Write-Host "Press CTRL-C to end."
    Write-Host ""
    Try {
        Get-Content $statusFile -Force -Wait |
            ForEach-Object {
                $logMsg = $_ | ConvertFrom-Json
                # Update status on the step for the specific service
                $svc = $services | Where-Object { $_.Service -eq $logMsg.Service }
                $svc.psobject.properties[$logMsg.step].value = $logMsg.status

                Clear-Host
                $services | Format-Table -Property Service,Old,New,CleanRepo,NuGet,Analyzers,CleanImports,Build,Invoke,Done,LastFailure
            } -ErrorAction Stop
    } Catch {
        WatchLogFile $statusFile
    }
}

And updates are written like this in the spawned scripts:

Add-Content $statusFile $jsonLogMessage

Is there an easy way to add retries, or how can I make sure my script survives file locks?


Solution

  • As @ChiliYago pointed out, I should use jobs, so that is what I have done now. I had to figure out how to collect the output as it arrived from the many scripts.

    So I added all my jobs to an array and monitored them like this. Beware that you can receive multiple lines if your script has produced multiple outputs since you last invoked Receive-Job. Be sure to use Write-Output from the scripts you execute as jobs.

    $jobs=@()
    foreach ($script in $scripts)
    {
        $sb = [scriptblock]::create("$script $(&{$args} @jobArgs)")
        $jobs += Start-Job -ScriptBlock $sb
    }
    
    # Seed the counter: $null -gt 0 is $false in PowerShell,
    # so an uninitialized $hasRunningJobs would skip the loop entirely.
    $hasRunningJobs = $jobs.Count
    while ($hasRunningJobs -gt 0)
    {
        $runningJobs = $jobs | Where-Object { $_.State -eq "Running" } | Measure-Object
        $hasRunningJobs = $runningJobs.Count
    
        foreach ($job in $jobs)
        {
            $outvar = Receive-Job -Job $job
            if ($outvar)
            {
                $outvar -split "`n" | ForEach-Object { UpdateStatusTable $_ }
            }
        }
    }
    
    Write-Host "All scripts done."
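For completeness: if you keep the shared-log approach from the question, a small retry wrapper around Add-Content can survive transient file locks. This is a sketch, not part of the original answer; the function name, attempt count, and backoff are mine.

```powershell
# Hypothetical retry wrapper: Add-Content throws an IOException while another
# process holds the log file, so retry a few times with a short backoff.
# Since this is a progress log (not an audit log), giving up after the last
# attempt and dropping the message is acceptable.
function Add-ContentWithRetry($path, $line, $maxAttempts = 5)
{
    for ($i = 1; $i -le $maxAttempts; $i++)
    {
        try {
            Add-Content -Path $path -Value $line -ErrorAction Stop
            return
        }
        catch [System.IO.IOException] {
            if ($i -eq $maxAttempts) { return }   # drop the message after final attempt
            Start-Sleep -Milliseconds (50 * $i)   # simple linear backoff
        }
    }
}
```

The spawned scripts would then call `Add-ContentWithRetry $statusFile $jsonLogMessage` instead of Add-Content directly.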