I am trying to pass a whole manager object to a process, but I am unable to do so. When I try, I get this error:
Traceback (most recent call last):
File "TwitchMarketDatabase.py", line 46, in <module>
multiprocessing.Process(target=accept_thread, args=(SERVER, M)).start()
File "C:\Program Files\Python36\lib\multiprocessing\process.py", line 105, in start
self._popen = self._Popen(self)
File "C:\Program Files\Python36\lib\multiprocessing\context.py", line 223, in _Popen
return _default_context.get_context().Process._Popen(process_obj)
File "C:\Program Files\Python36\lib\multiprocessing\context.py", line 322, in _Popen
return Popen(process_obj)
File "C:\Program Files\Python36\lib\multiprocessing\popen_spawn_win32.py", line 65, in __init__
reduction.dump(process_obj, to_child)
File "C:\Program Files\Python36\lib\multiprocessing\reduction.py", line 60, in dump
ForkingPickler(file, protocol).dump(obj)
File "C:\Program Files\Python36\lib\multiprocessing\connection.py", line 939, in reduce_pipe_connection
dh = reduction.DupHandle(conn.fileno(), access)
File "C:\Program Files\Python36\lib\multiprocessing\connection.py", line 170, in fileno
self._check_closed()
File "C:\Program Files\Python36\lib\multiprocessing\connection.py", line 136, in _check_closed
raise OSError("handle is closed")
OSError: handle is closed
C:\projects\twitch-market>pause
Press any key to continue . . . Traceback (most recent call last):
File "<string>", line 1, in <module>
File "C:\Program Files\Python36\lib\multiprocessing\spawn.py", line 99, in spawn_main
new_handle = reduction.steal_handle(parent_pid, pipe_handle)
File "C:\Program Files\Python36\lib\multiprocessing\reduction.py", line 82, in steal_handle
_winapi.PROCESS_DUP_HANDLE, False, source_pid)
OSError: [WinError 87] The parameter is incorrect
I'm trying to make a handler for connections to a Python program that writes to files concurrently using processes, unless a file is already being written to. I guess this is a pretty stupid way of doing it, but I don't see any other way. So I intend to use the M (manager) object to spawn a lock for each file while it is being written to, and then add that lock to the dictionary. This is why I need to pass M to the accept-connections thread. As I said, this is a really convoluted way of doing it, and I am open to suggestions as to what I could use instead.
import multiprocessing
import socket

# accept_thread is defined elsewhere in the script
if __name__ == '__main__':
    SERVER = socket.socket()
    SERVER.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    SERVER.bind(('127.0.0.1', 9988))
    M = multiprocessing.Manager()
    M.queues = M.dict()
    multiprocessing.Process(target=accept_thread, args=(SERVER, M)).start()
    print('Server Started.')
    input()
You cannot pass a socket object as an argument to a process like this. To communicate between a child process and its parent, pass proxy references to shared, mutable in-memory objects instead.
You can approach this in multiple ways:
- Create a Pipe, pass one end of it to the process, write into the pipe from the process, and receive that data in the parent. This is the best option if you need to pass raw data from a process back to the parent.
- Create a Queue, add content to it from the process, and receive it on the other end in the parent (see the sketch after this list).
- Create a file-like object, put it inside a namespace created by the manager, and pass the namespace reference to the process as an argument.
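A minimal sketch of the Queue approach, assuming an accept_thread-style worker function like the one in the question (the tuple it sends back is just a placeholder):

import multiprocessing

def accept_thread(results):
    # Child process: do the real work here (accepting connections,
    # queuing file writes, ...) and push results back to the parent.
    results.put(('status', 'child started'))

if __name__ == '__main__':
    results = multiprocessing.Queue()   # picklable, safe to pass to a child
    p = multiprocessing.Process(target=accept_thread, args=(results,))
    p.start()
    print(results.get())                # parent receives data from the child
    p.join()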
If you need to pass that data on to yet another process, you can still send it through the socket you have created. In that case, though, it is better to create the socket object inside the process and communicate over that open socket directly, as in the sketch below.
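For example, here is a hedged sketch of that idea applied to the code in the question: the listening socket is created inside the child, and only picklable things (the address and a manager dict proxy) are passed as arguments. The 127.0.0.1:9988 address comes from the question; the other names are illustrative.

import multiprocessing
import socket

def accept_thread(host, port, state):
    # Create and bind the listening socket inside the child process
    # instead of trying to pickle one created in the parent.
    server = socket.socket()
    server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    server.bind((host, port))
    server.listen()
    while True:
        conn, addr = server.accept()
        state['last_client'] = addr      # manager dict proxies pickle fine
        conn.close()

if __name__ == '__main__':
    manager = multiprocessing.Manager()
    state = manager.dict()               # pass the proxy, not the Manager itself
    p = multiprocessing.Process(target=accept_thread,
                                args=('127.0.0.1', 9988, state))
    p.start()
    input('Server started; press Enter to stop.\n')
    p.terminate()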
There are more involved options if your processes are spread across a network, such as establishing a cross-network manager that works like a small server over sockets and has the processes built on top of it. However, this is unnecessary if you just need to communicate between a child process and a parent process that are run from the same file.
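For completeness, a minimal sketch of that cross-network setup using multiprocessing.managers.BaseManager; the address, authkey, and the get_queue name are illustrative, not taken from the question:

from multiprocessing.managers import BaseManager
from queue import Queue

shared_queue = Queue()

class QueueManager(BaseManager):
    pass

# Server side: expose the queue on a TCP port for remote clients.
QueueManager.register('get_queue', callable=lambda: shared_queue)
manager = QueueManager(address=('', 50000), authkey=b'change-me')
server = manager.get_server()
server.serve_forever()

# Client side (another process or machine):
#   QueueManager.register('get_queue')
#   m = QueueManager(address=('server-host', 50000), authkey=b'change-me')
#   m.connect()
#   m.get_queue().put('hello over the network')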