I've been struggling with this for a week now and have exhausted all the methods and options I have found online. I am hoping someone here will be able to help me out with this.
I am using PowerShell to start 8 jobs, each running FFmpeg to stream a 7-minute file from disk to a remote RTMP server; each job pulls from a different file. The FFmpeg command sits inside a do-while loop so that it restreams continuously.
This causes the shell I launched the jobs from to accumulate a massive amount of memory, consuming all it can: within 24 hours it had consumed 30 of my server's 32 GB.
Here is my launch code; any help would be appreciated.
Start-Job -Name v6 -ScriptBlock {
    do {
        # Invoke-Expression runs ffmpeg; its standard output is captured into $f
        $f = Invoke-Expression -Command "ffmpeg -re -i `"C:\Shares\Matthew\180p_3000k.mp4`" -vcodec copy -acodec copy -f flv -y rtmp://<ip>/<appName>/<streamName>"
        # Clear $f after each run in an attempt to release the captured output
        $f = $null
    } while ($true)  # loop forever so the stream restarts when the file ends
}
I've tried receiving the jobs and piping them to Out-Null, I've tried setting $f to $null before starting the do-while loop, and a few other things I found online, all to no avail. Thanks, everyone, for your time!
Better late than never, I guess. I've had the same problem with huge memory consumption when running FFmpeg in PowerShell jobs. The core of the issue is that a PowerShell job stores all of a command's output in memory until it is received, and FFmpeg is extremely happy to log output to both the standard output and standard error streams.
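You can watch the buffering happen for yourself. Here's a quick sketch (the endless Get-Date loop just stands in for FFmpeg's logging chatter):

# Each object a job emits is buffered until someone calls Receive-Job;
# for a never-ending loop, the buffer only grows.
$j = Start-Job { while ($true) { Get-Date; Start-Sleep -Milliseconds 100 } }
Start-Sleep -Seconds 5
$j.ChildJobs[0].Output.Count   # number of buffered objects so far
Stop-Job $j; Remove-Job $j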
My solution was to add the parameter "-loglevel quiet" to FFmpeg. Alternatively, you could redirect both the standard and error streams to null (it is not enough to redirect just the standard stream). For more on how to redirect the standard streams, refer to this question: Redirection of standard and error output appending to the same log-file
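Applied to the launch code from the question, the job body might look something like this (a sketch, assuming ffmpeg is on the job's PATH; the file path and RTMP placeholders are the asker's):

Start-Job -Name v6 -ScriptBlock {
    do {
        # "-loglevel quiet" stops ffmpeg writing progress/diagnostics, so the
        # job has next to nothing to buffer. The trailing "2>&1 | Out-Null"
        # is the alternative approach: it discards both the standard and
        # error streams. Either one alone should be enough.
        ffmpeg -loglevel quiet -re -i "C:\Shares\Matthew\180p_3000k.mp4" -vcodec copy -acodec copy -f flv -y "rtmp://<ip>/<appName>/<streamName>" 2>&1 | Out-Null
    } while ($true)
}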