Tags: python, multithreading, terminal, multiprocessing, gazebo-simu

How can I choose which CPU runs a Python script?


I want to run 2 different programs at the same time. The first one is a Python script; the other is a simulation program (Gazebo). Briefly, I want to send commands from the Python script to the Gazebo simulation. I already know ROS and multiprocessing, but my problem is not about these. While Gazebo is running, the Python script's FPS value decreases. So I want to run the Python script on a chosen CPU core without affecting Gazebo, and run Gazebo on another CPU core. My request is different from multiprocessing, because the FPS decreases even when I don't connect the Python script to Gazebo. For example, I start the Python script, and after that I start Gazebo from another terminal. These are independent workloads, yet even in this situation they affect each other. As a result, they will affect each other even if I use multiprocessing. Am I wrong? What should I do about this problem?

Edit: I could also ask the question in a different way. Let's say we have 2 different Python scripts, and both use multiprocessing. I create 2 processes in each script. When I run these 2 files in different terminals, will 4 different CPUs be used in total? Briefly, are the processes used by one Python file different from, or the same as, the processes in the other Python script run from a different terminal?


Solution

  • While it's technically possible to pin a process to a specific set of cores (with taskset on Linux, for example; I'm sure there's an equivalent on other OSes), it's extremely likely that it won't help, because operating systems are already quite good at determining where and when to execute a process. If you are seeing a slowdown when running the two scripts, it may be caused by a wide variety of things, so I would look not only at CPU usage but also at disk activity, bus utilization (a Raspberry Pi, for example, has very little bus capacity), bad interaction (waiting while doing nothing) between the two programs, etc. If you still want to try pinning, see the sketch at the end of this answer.

    To answer your second question, I think I have to address a misconception: a process is much more than the file that created it. A single file is generally the entry point for the process, but the OS then reads that file into memory, loads the libraries associated with it, and begins execution. At that point you could delete the file and the program would continue just fine, until it tries to load additional resources that hadn't been preemptively loaded by the OS. If you were to start a second copy of the same executable, the OS would go through the same loading and execution steps, but it would start with a new chunk of memory (processes get their own private memory space). To that end, every Python process actually starts with the same executable (python.exe on Windows), gets its own process ID, and its own memory:

    [screenshot: a process list showing several python.exe processes, each with its own process ID and memory usage]

    The Python executable then typically loads a text file and does something with it (your_script.py), but that has little to do with how the process is managed. When you use multiprocessing in your script, Python uses OS facilities to start an entirely new process, just as if you had launched it from a new terminal window (though it generally won't get its own actual GUI window). The sketches below illustrate both points.
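If you still want to experiment with pinning despite the caveats above, here is a minimal sketch. It assumes Linux: os.sched_setaffinity is only available there, and the core numbers are placeholders for your own machine.

```python
import os

# Linux-only: restrict the current process to cores 0 and 1.
# A pid of 0 means "the calling process".
os.sched_setaffinity(0, {0, 1})

# Verify which cores the process is now allowed to run on.
print(os.sched_getaffinity(0))  # e.g. {0, 1}
```

The same thing can be done from the shell without modifying either program, e.g. `taskset -c 0,1 python your_script.py` in one terminal and `taskset -c 2,3 gazebo` in another (assuming Gazebo is started with the gazebo command). Note that this only partitions CPU time; it does nothing about shared disk, memory bandwidth, or bus contention, which is why it often doesn't fix this kind of slowdown.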
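And to make the process model concrete, here is a minimal sketch showing that every process, including the ones multiprocessing creates, is a full OS process with its own process ID:

```python
import os
from multiprocessing import Process

def report(label):
    # Each worker is a separate OS process with its own PID and private memory.
    print(f"{label}: pid={os.getpid()}, parent={os.getppid()}")

if __name__ == "__main__":
    print(f"main: pid={os.getpid()}")
    workers = [Process(target=report, args=(f"worker-{i}",)) for i in range(2)]
    for w in workers:
        w.start()
    for w in workers:
        w.join()
```

Run this same file from two different terminals and you will see four distinct worker PIDs in total: the processes created by one script are entirely separate from those created by the other, and the OS scheduler is free to spread all of them across whatever cores are available.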