I have a PowerShell script that I want to run on 5,000+ endpoints. The script enumerates a bunch of information about each host, which I then want to ingest into another program for analysis.
The problem I'm having is that Invoke-Command does not allow me to process the data returned from each machine as it's received. This is an issue because with 5,000+ endpoints I run into memory constraints on the machine I'm running Invoke-Command from.
Ideally I'd like to be able to execute a script block each time I receive a response from a machine. Something similar to this, for example:
$ProcessOutput = {
    # $_ = data returned from one machine
    $_ | Out-File ($_.PSComputerName)
}

Invoke-Command -FilePath $ScriptPath -ComputerName $Computers -ProcessingScriptBlock $ProcessOutput
Is this already possible and I'm just overlooking something? Or is the best solution to split my computer list into chunks and scan each chunk one by one (resulting in longer run times), along the lines of the sketch below?
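For reference, the chunked fallback I'm trying to avoid would look roughly like this (the batch size is arbitrary):

$BatchSize = 250   # arbitrary; tune to available memory
for ($i = 0; $i -lt $Computers.Count; $i += $BatchSize) {
    $Batch = $Computers[$i..([Math]::Min($i + $BatchSize, $Computers.Count) - 1)]
    Invoke-Command -FilePath $ScriptPath -ComputerName $Batch |
        ForEach-Object { $_ | Out-File "$($_.PSComputerName).txt" -Append }
}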
You can use jobs, foreach -parallel (in a workflow), or runspaces (faster, but more complicated than jobs) to run parallel workloads. Here's an example using jobs, which is the easiest to read; a workflow sketch follows it.
$ProcessOutput = {
    # $_ = one object returned from one machine
    # -Append so multiple objects from the same computer don't overwrite each other
    $_ | Out-File "$($_.PSComputerName).txt" -Append
}
$Computers = "localhost", "frode-pc"
#Start one job with a child job per computer
$mainjob = Invoke-Command -FilePath $ScriptPath -ComputerName $Computers -AsJob

#Process results as they complete. Receive-Job drains a child job's output,
#so jobs that were already received return nothing on later passes.
while ($mainjob.State -eq "Running") {
    $mainjob.ChildJobs | Where-Object { $_.State -eq 'Completed' } | Receive-Job | ForEach-Object $ProcessOutput
    Start-Sleep -Milliseconds 500   #don't busy-spin while waiting
}

#Pick up anything that finished after the last loop pass
$mainjob.ChildJobs | Where-Object { $_.State -eq 'Completed' } | Receive-Job | ForEach-Object $ProcessOutput

#Unreachable hosts end up in the Failed state; inspect them with:
#$mainjob.ChildJobs | Where-Object { $_.State -eq 'Failed' }
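For completeness, here's a minimal sketch of the foreach -parallel route. The Get-HostInfo workflow name is just for illustration, and workflows only exist in Windows PowerShell 3.0-5.1 (they were removed in PowerShell Core):

workflow Get-HostInfo {
    param([string[]]$Computers, [string]$ScriptPath)
    #foreach -parallel fans the loop body out across the computer list
    foreach -parallel ($Computer in $Computers) {
        #InlineScript runs ordinary PowerShell inside the workflow
        InlineScript {
            $Computer = $using:Computer
            Invoke-Command -ComputerName $Computer -FilePath $using:ScriptPath |
                Out-File "$Computer.txt" -Append
        }
    }
}

Get-HostInfo -Computers $Computers -ScriptPath $ScriptPath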
If performance is critical, use runspaces. Check out Invoke-Parallel for an easy-to-use implementation of runspaces.
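A rough usage sketch, assuming you've downloaded and dot-sourced Invoke-Parallel.ps1 from its GitHub repo (parameter names may differ slightly between versions, so check Get-Help on your copy):

#Dot-source the Invoke-Parallel function from the downloaded script
. .\Invoke-Parallel.ps1

#-ImportVariables makes local variables like $ScriptPath visible in each runspace
$Computers | Invoke-Parallel -Throttle 50 -ImportVariables -ScriptBlock {
    #$_ = one computer name; results are emitted as each runspace finishes
    Invoke-Command -ComputerName $_ -FilePath $ScriptPath |
        Out-File "$_.txt" -Append
}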