python, performance, process, celery, monitoring

Is there a way to track CPU and memory usage of subprocesses from another subprocess?


I want to start a Celery process that always runs in the background. I want this running process to be able to track the system resource usage (CPU, RAM, etc.) of all the other processes that get started whenever a new task needs to be executed in parallel. I then want the tracking process to log that data to a file at a regular interval.
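
Roughly, this is the kind of setup I have in mind; the task name, broker URL, log path, and interval below are just placeholders:

    import time

    from celery import Celery

    app = Celery("monitor", broker="redis://localhost:6379/0")  # placeholder broker URL

    @app.task
    def monitor_resources(interval=30):
        """Long-running background task that should watch the other worker
        processes and append their CPU/RAM usage to a log file every
        `interval` seconds."""
        while True:
            # TODO: collect CPU and memory usage of the other task processes
            # here and write a line to e.g. /var/log/worker_resources.log
            time.sleep(interval)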

If the above implementation is possible, is it safe to use in a production setup?

I haven't tried them yet, but I have read about psutil and py-spy. I am not sure, however, whether they can be used to track processes from another, parallel process.


Solution

  • psutil sure can.

    
    import psutil
    
    p = psutil.Process(121000)  # any PID you want to track: parent, child, sibling, unrelated; it doesn't matter

    p.cpu_percent()
    # 1.4 -> the process is using 1.4% of one CPU
    # (the first call without an interval returns 0.0; later calls measure usage since the previous call)
    p.cpu_times()
    # pcputimes(user=42.4, system=0.01, children_user=0.0, children_system=0.0, iowait=0.01)
    # or in more detail:
    p.cpu_times().user, p.cpu_times().system
    # CPU time spent so far, in seconds

    p.memory_percent()
    # 0.14 -> the process is using 0.14% of total RAM
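
    To tie this back to the question: a minimal sketch of a watcher that walks a process's children with psutil and appends their usage to a file at a regular interval could look like the following. The parent PID, log path, and interval are assumptions; run it in its own long-lived process (for example, the background Celery task from the question).

    import time
    import psutil

    def log_children_usage(parent_pid, logfile="usage.log", interval=30):
        """Periodically append CPU and memory usage of a process and all of
        its children to a log file."""
        parent = psutil.Process(parent_pid)
        tracked = {}  # pid -> psutil.Process, reused so cpu_percent() measures between iterations
        while True:
            for proc in [parent] + parent.children(recursive=True):
                tracked.setdefault(proc.pid, proc)
            with open(logfile, "a") as f:
                for pid, proc in list(tracked.items()):
                    try:
                        f.write(f"{time.time():.0f} pid={pid} "
                                f"cpu={proc.cpu_percent():.1f}% "
                                f"mem={proc.memory_percent():.2f}%\n")
                    except psutil.NoSuchProcess:
                        del tracked[pid]  # the process has exited; stop tracking it
            time.sleep(interval)

    Note that the first sample for each process reports 0.0% CPU, because cpu_percent() measures usage since the previous call on the same Process object; from the second iteration onward the values are meaningful.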