
python multiprocessing: no diminishing returns?


In fact, even when n < 4, I see the scheduler frenetically rotating the processes among my processors, instead of assigning each process to its own processor and avoiding context switching.

Processes are not bound to a particular processor by default; one of the main reasons is to avoid unequal heating of the processor, which can cause mechanical stress and reduce its lifetime.

There are ways to force a process to run on a single core (look at the psutil module). Pinning has advantages such as better use of cache memory and avoidance of context switching, but in most cases (if not all) it makes no big difference in terms of performance.
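As a sketch of what pinning looks like: on Linux the standard library exposes `os.sched_setaffinity`, and psutil provides a cross-platform equivalent, `psutil.Process().cpu_affinity([...])`. The snippet below uses the stdlib call, guarded because it does not exist on all platforms.

```python
import os

# Pin the current process to core 0.
# os.sched_setaffinity is Linux-only; psutil offers the
# cross-platform psutil.Process().cpu_affinity([0]) instead.
if hasattr(os, "sched_setaffinity"):
    os.sched_setaffinity(0, {0})      # pid 0 means "this process"
    print(os.sched_getaffinity(0))    # the allowed-CPU set is now {0}
```

After this call, the scheduler will only run the process on core 0, so it keeps its cache lines warm but can no longer be migrated to an idle core.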

So if you spawn more processes than you have cores, they will just act like threads, and the scheduler will switch between them to optimize execution. Performance will only be (very) slightly lowered, as you were already switching contexts with fewer than 4 processes.


Answering my own question:

Firstly, I seem to have made an error in my post. It is not true that the CPU being used changes frantically. If I fire up two CPU-intensive processes, they keep changing cores, but only between two cores. My computer has 4 physical cores, each of which exposes 2 logical cores (for hyperthreading). I guess what is going on is that it is switching between those 2 logical cores. It isn't Linux doing this; it's the CPU itself.
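One detail worth noting: `os.cpu_count()` reports logical CPUs (hardware threads), not physical cores, so a machine like the one described above shows 8, not 4. A quick check, with psutil's physical-core query mentioned as a comment since it is a third-party dependency:

```python
import os

# os.cpu_count() counts logical CPUs (hardware threads), so a
# 4-core machine with hyperthreading reports 8.
print("logical CPUs:", os.cpu_count())

# psutil (third-party) can report physical cores separately:
#   import psutil
#   psutil.cpu_count(logical=False)   # physical cores only
```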

That being said, I am still surprised that context switching is not more of a pain than it is.

EDIT: There is a nice discussion, with better empirical work than mine, on this blog.