Tags: c#, windows, process, performance-testing

Inconsistent Working Set and Peak Working Set values for a process


[Background]

  1. I am using the System.Diagnostics.Process class in .NET to track the performance of a process.
  2. I am taking 30 samples at an interval of 1 second.
  3. The value of the working set is populated as:

long workingSet = requiredProcess.WorkingSet64; where

Process requiredProcess = Process.GetProcessesByName(processName).First();
  4. The value of the peak working set is populated from the same process instance as:

long peakWorkingSet = requiredProcess.PeakWorkingSet64;
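For context, the sampling setup above can be sketched roughly as follows. The process name and loop shape are illustrative, not from the original question; note that Process caches its counters, so Refresh() is needed before each read:

```csharp
using System;
using System.Diagnostics;
using System.Linq;
using System.Threading;

class WorkingSetSampler
{
    static void Main()
    {
        string processName = "notepad"; // hypothetical target process
        Process requiredProcess = Process.GetProcessesByName(processName).First();

        for (int i = 0; i < 30; i++)
        {
            requiredProcess.Refresh(); // re-read the cached process info each sample
            long workingSet = requiredProcess.WorkingSet64;
            long peakWorkingSet = requiredProcess.PeakWorkingSet64;
            Console.WriteLine($"sample {i}: WorkingSet64={workingSet} PeakWorkingSet64={peakWorkingSet}");
            Thread.Sleep(1000); // 1-second sampling interval
        }
    }
}
```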

[Query]

I expect PeakWorkingSet64 to be the peak value of the memory represented by WorkingSet64 (please correct me if I am mistaken here).

But for some reason, I see a PeakWorkingSet64 value of 80 K when the sampled WorkingSet64 values never reached that; they fluctuated around 50 K.

Any inputs appreciated. Please help me understand.


Solution

  • Not sure what to say about that (or why I in particular should be able to divine it).

    PeakWorkingSet64 does indeed hold, as you expect, the peak value over the entire history of WorkingSet64, as specified in the MSDN docs:

    The maximum amount of physical memory, in bytes, allocated for the associated process since it was started.

    Note that this means "since the process was spawned", which e.g. includes the period of time during which the runtime initializes.

    Now, you have tried to measure the memory consumption by taking a small number (30) of discrete samples, one second apart. That is not a very reliable way of measuring. All those samples tell you is what the working set looked like at the precise moments you looked at it, not what it looked like a moment earlier or a moment later.
    The working set might incidentally be around 50 KiB every time you look at it once per second, yet reach 80 K (or any other value) in between the samples. Working sets are not constant; they change all the time.
    Further, and more likely, the working set might be much larger during startup (that is, before your code is even executed!). The peak value would then naturally be higher, but you never get to "measure" such a high value with your samples, even if you take a million of them.
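    This effect is easy to reproduce. The sketch below (my own illustration, not the asker's code, sampling the current process for simplicity) makes a transient allocation and releases it before sampling; the peak working set stays elevated even though the current working set typically drops back down:

    ```csharp
    using System;
    using System.Diagnostics;

    class PeakDemo
    {
        static void Main()
        {
            Process self = Process.GetCurrentProcess();

            // Transient spike: touch ~100 MB so the pages enter the working set...
            byte[] spike = new byte[100 * 1024 * 1024];
            for (int i = 0; i < spike.Length; i += 4096)
                spike[i] = 1;

            spike = null; // ...then release it
            GC.Collect();
            GC.WaitForPendingFinalizers();
            GC.Collect();

            self.Refresh(); // re-read the cached counters
            Console.WriteLine($"current working set: {self.WorkingSet64}");
            Console.WriteLine($"peak working set:    {self.PeakWorkingSet64}");
            // PeakWorkingSet64 still reflects the spike, even though no sample
            // taken from this point on would ever observe that much memory in use.
        }
    }
    ```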