TypeError: cannot pickle 'weakref' object
I just ran into the same traceback and managed to solve it. It was caused by an object that held a running or exited Process as an attribute while starting another Process from that same object.
Problem
This is a minimal code to produce your error:
```python
import multiprocessing

class Foo:
    def __init__(self):
        self.process_1 = multiprocessing.Process(target=self.do_stuff1)
        self.process_2 = multiprocessing.Process(target=self.do_stuff2)

    def do_multiprocessing(self):
        self.process_1.start()
        self.process_2.start()

    def do_stuff1(self):
        print("Doing 1")

    def do_stuff2(self):
        print("Doing 2")

if __name__ == '__main__':
    foo = Foo()
    foo.do_multiprocessing()
```

[out]:

```
Traceback (most recent call last):
  File "myfile.py", line 21, in <module>
    foo.do_multiprocessing()
  File "myfile.py", line 11, in do_multiprocessing
    self.process_2.start()
  File "...\lib\multiprocessing\process.py", line 121, in start
    self._popen = self._Popen(self)
  File "...\lib\multiprocessing\context.py", line 224, in _Popen
    return _default_context.get_context().Process._Popen(process_obj)
  File "...\lib\multiprocessing\context.py", line 327, in _Popen
    return Popen(process_obj)
  File "...\lib\multiprocessing\popen_spawn_win32.py", line 93, in __init__
    reduction.dump(process_obj, to_child)
  File "...\lib\multiprocessing\reduction.py", line 60, in dump
    ForkingPickler(file, protocol).dump(obj)
TypeError: cannot pickle 'weakref' object
Doing 1
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "...\lib\multiprocessing\spawn.py", line 116, in spawn_main
    exitcode = _main(fd, parent_sentinel)
  File "...\lib\multiprocessing\spawn.py", line 126, in _main
    self = reduction.pickle.load(from_parent)
EOFError: Ran out of input
```
So the issue is that `Foo` still contains the running/exited process `foo.process_1` when it starts `foo.process_2`. Because the targets are bound methods, starting `process_2` pickles the whole `foo` object, including the already-started `process_1`, which cannot be pickled.
Solution 1
Set `foo.process_1` to `None` (or something else), or store the Process objects somewhere other than on `foo`, so they are not pickled along with the object when `process_2` starts.
```python
...
    def do_multiprocessing(self):
        self.process_1.start()
        self.process_1 = None  # Remove exited process
        self.process_2.start()
...
```
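A variant of Solution 1 that still lets you join the processes: keep them in a local list instead of as attributes of `self`. This is a minimal sketch, not the asker's code; the class and method names are just placeholders.

```python
import multiprocessing

class Foo:
    def do_stuff1(self):
        print("Doing 1")

    def do_stuff2(self):
        print("Doing 2")

    def do_multiprocessing(self):
        # Keep the Process objects in a local list instead of on self:
        # when each child is spawned, self gets pickled, and its __dict__
        # no longer contains any started Process objects.
        procs = [
            multiprocessing.Process(target=self.do_stuff1),
            multiprocessing.Process(target=self.do_stuff2),
        ]
        for p in procs:
            p.start()
        for p in procs:
            p.join()  # the handles are still available here
        return [p.exitcode for p in procs]

if __name__ == '__main__':
    print(Foo().do_multiprocessing())
```

This keeps the object passed to the children clean while preserving the handles needed for `join()`.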
Solution 2
Remove the problematic variable (process_1) from pickling:
```python
class Foo:
    def __getstate__(self):
        # capture what is normally pickled
        state = self.__dict__.copy()
        # remove unpicklable/problematic variables
        state['process_1'] = None
        return state
...
```
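To see `__getstate__` in action without multiprocessing, here is a minimal sketch using a `weakref` attribute as a stand-in for the started Process (weakrefs raise the same `TypeError: cannot pickle 'weakref' object`); the names `Node`, `data`, and `_ref` are my own, not from the question.

```python
import pickle
import weakref

class Node:
    pass

class Foo:
    def __init__(self):
        self.data = 42
        # unpicklable attribute; pickle.dumps(self) would fail on it
        # with "cannot pickle 'weakref' object" if we kept it in the state
        self._ref = weakref.ref(Node())

    def __getstate__(self):
        # capture what is normally pickled
        state = self.__dict__.copy()
        # drop the unpicklable attribute before pickling
        state['_ref'] = None
        return state

foo = Foo()
restored = pickle.loads(pickle.dumps(foo))
print(restored.data)   # 42
print(restored._ref)   # None
```

The restored object keeps everything except the attribute that was blanked out, which is exactly what happens to `process_1` in Solution 2.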
This seems to be a problem in newer Python versions. My own code worked fine on 3.7 but failed with this error on 3.9.
I tested your code (from `recv_data`). Since you join the processes and need them afterwards, you should use Solution 2 or store the processes somewhere other than on `recv_data`. I'm not sure what other problems your code has.