I have some class:
class Example:
    def __init__(self):
        ...

    def method(self):
        ...
And a class whose attribute is a list of the previous class:
class Main:
    def __init__(self, elements: list):
        self.elements: list[Example] = elements
Suppose I have main = Main(elements=[example1, example2, example3]). From the Main class, how can I execute the Example class method for all instances, without a for loop and, additionally, in parallel?
Something like:
with multiprocessing.Pool() as p:
    p.map(main.elements.method())  # Like [example1.method(), example2.method(), example3.method()]
For executing the method on all instances, this answer using the getattr() function looks good, but how can it then be parallelized?
Thanks in advance!
You could make the "target" of the multiprocessing.pool.Pool.map call a function that delegates to the method method of the instance passed to it. Then you just specify, as the iterable argument to map, the list of Example instances whose method you wish to have called in parallel:
from multiprocessing import Pool

class Example:
    def __init__(self, x):
        self.x = x
        ...

    def method(self):
        print('method called for instance:', self.x)

class Main:
    def __init__(self, elements: list):
        self.elements: list[Example] = elements

def method_delegator(o):
    return o.method()

if __name__ == '__main__':
    main = Main(elements=[Example(1), Example(2), Example(3)])
    with Pool(3) as pool:
        pool.map(method_delegator, main.elements)
Prints:
method called for instance: 1
method called for instance: 2
method called for instance: 3
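As an aside, if method is I/O-bound rather than CPU-bound, the same map pattern works with a thread pool (multiprocessing.pool.ThreadPool). Since the workers are threads in the same process, nothing is pickled, so even a lambda can serve as the delegator. A minimal sketch:

```python
from multiprocessing.pool import ThreadPool

class Example:
    def __init__(self, x):
        self.x = x

    def method(self):
        # Runs in a worker thread of the same process, so nothing is pickled
        return f'method called for instance: {self.x}'

elements = [Example(1), Example(2), Example(3)]
with ThreadPool(3) as pool:
    # Threads share the process, so a lambda is a valid map target here
    results = pool.map(lambda o: o.method(), elements)
print(results)
```

Note, though, that threads only give a real speedup when method spends its time waiting (on I/O, for example), not computing.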
Or (and thanks to AKX for this) you can use the operator.methodcaller function instead of creating a special delegator function:
from multiprocessing import Pool
from operator import methodcaller

class Example:
    def __init__(self, x):
        self.x = x
        ...

    def method(self):
        print('method called for instance:', self.x)

class Main:
    def __init__(self, elements: list):
        self.elements: list[Example] = elements

if __name__ == '__main__':
    main = Main(elements=[Example(1), Example(2), Example(3)])
    with Pool(3) as pool:
        pool.map(methodcaller('method'), main.elements)
Answer to New Question
It probably would have been better for you to open a new question rather than piggy-back a new question on top of an old one; the two questions are not really related.
When you make a method call on an Example instance in a child process, a copy of the instance as it exists in the main process is created in the child process. How this is done depends on whether the spawn or fork method is used to create new processes. So unless what you are passing is sharable across processes (for example, it resides in shared memory, or it is a managed object; see multiprocessing.Manager), any updates made by your worker function method do not modify the copy that resides in the main process.
In this specific example, the easiest solution I can think of is to have method return the values that need to be set on the message and y attributes, and then have the main process do the actual updating of the Example instances. For example:
from multiprocessing import Pool
from operator import methodcaller

class Example:
    def __init__(self, x):
        self.x = x
        ...

    def __repr__(self):
        return str(self.__dict__)

    def method(self):
        # Return the message and y attributes to be updated by the
        # main process
        return f'method called for instance: {self.x}', None

class Main:
    def __init__(self, elements: list):
        self.elements: list[Example] = elements

if __name__ == '__main__':
    elements = [Example(1), Example(2), Example(3)]
    main = Main(elements=elements)
    with Pool(3) as pool:
        for idx, result in enumerate(pool.map(methodcaller('method'), main.elements)):
            element = elements[idx]
            element.message, element.y = result  # Unpack
            print(element)
Prints:
{'x': 1, 'message': 'method called for instance: 1', 'y': None}
{'x': 2, 'message': 'method called for instance: 2', 'y': None}
{'x': 3, 'message': 'method called for instance: 3', 'y': None}