Python multiprocessing continuously spawns pythonw.exe processes without doing any actual work


You need to protect the entry point of the program with an if __name__ == '__main__': guard.

This is a Windows-specific problem. On Windows, your module has to be imported into a new Python interpreter so that the child process can access your target function. If you don't stop that new interpreter from running the start-up code, it will spawn another child, which will then spawn another child, until it's pythonw.exe processes as far as the eye can see.

Other platforms use os.fork() to launch the subprocesses, so they don't have the problem of re-importing the module.
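To make the failure mode concrete, here is a sketch of what the unguarded version of the script looks like (the file name mp.py is an assumption; the point is that the Process calls sit at module level):

# mp.py -- no __main__ guard
from multiprocessing import Process
import sys

def func(x):
    print 'works ', x + 2
    sys.stdout.flush()

# Module-level code: every child that re-imports this file runs these
# lines again and starts yet another child -- an endless chain of
# pythonw.exe processes on Windows.
p = Process(target=func, args=(2,))
p.start()
p.join()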

So your code will need to look like this:

from multiprocessing import Process
import sys

def func(x):
    print 'works ', x + 2
    sys.stdout.flush()

if __name__ == '__main__':
    p = Process(target=func, args=(2,))
    p.start()
    p.join()
    p.terminate()
    print 'done'
    sys.stdout.flush()


According to the programming guidelines for multiprocessing, on Windows you need to use an if __name__ == '__main__': guard.
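A minimal sketch of the guarded pattern the guidelines describe; freeze_support() only matters if the script is later frozen into a Windows executable (e.g. with py2exe), and is a no-op otherwise:

from multiprocessing import Process, freeze_support

def func(x):
    print 'works ', x + 2

if __name__ == '__main__':
    # Has no effect unless the program has been frozen into a
    # Windows executable; safe to call in any case.
    freeze_support()
    p = Process(target=func, args=(2,))
    p.start()
    p.join()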


Funny, works on my Linux machine:

$ python mp.py
works  4
done
$

Is multiprocessing supposed to work on Windows? A lot of programs that originated in the Unix world don't handle Windows so well, because Unix uses fork(2) to clone processes quite cheaply, but (it is my understanding that) Windows does not support fork(2) gracefully, if at all.
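For what it's worth, multiprocessing is supposed to work on Windows; it just spawns a fresh interpreter and re-imports your module instead of calling fork(2), which is exactly why the __main__ guard is mandatory there. On Python 3.4+ you can inspect the start method explicitly; a small sketch, assuming a Python 3 interpreter:

import multiprocessing as mp

def func(x):
    print('works', x + 2)

if __name__ == '__main__':
    # Windows only supports 'spawn'; Linux defaults to 'fork',
    # which is why the unguarded script happens to work there.
    print(mp.get_start_method())
    p = mp.Process(target=func, args=(2,))
    p.start()
    p.join()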