I am trying to use multiprocessing in Python, where I want a pool of 4-5 processes running a method in parallel. The goal is to run a total of 1000 Monte Carlo simulations (roughly 200-250 simulations per process) instead of running all 1000 sequentially. I want each process to write into a common shared array by acquiring a lock as soon as it finishes processing one simulation result, writing the result, and then releasing the lock. So it should be a three-step process: acquire the lock, write the result, release the lock.

The problem is that every time I pass the array to a process, each process creates its own copy of the array, which is not what I want, since I want one common array. Can anyone help me by providing sample code?
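The three-step protocol the question describes can be sketched as follows (`store_result` is a hypothetical helper name, not part of any library; the slot index stands in for one worker's dedicated position in the shared array):

```python
from multiprocessing import Array, Lock

def store_result(shared, lock, slot, value):
    """Write one simulation result into its slot of the shared array."""
    lock.acquire()            # step 1: acquire the lock
    try:
        shared[slot] = value  # step 2: write the result
    finally:
        lock.release()        # step 3: release the lock

if __name__ == "__main__":
    shared = Array('i', 4)    # one slot per worker, zero-initialized
    lock = Lock()
    store_result(shared, lock, 2, 7)
    print(shared[:])  # [0, 0, 7, 0]
```

The `try`/`finally` ensures the lock is released even if the write raises, which is the same guarantee a `with lock:` block would give.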
Posted on 2016-08-24 12:58:04
Since you are only returning state from the child processes back to the parent, a shared array and explicit locks are overkill. You can use Pool.map or Pool.starmap to accomplish what you need. For example:
from multiprocessing import Pool

class Adder:
    """I'm using this class in place of a monte carlo simulator"""

    def add(self, a, b):
        return a + b

def setup(x, y, z):
    """Sets up the worker processes of the pool.

    Here, x, y, and z would be your global settings. They are only included
    as an example of how to pass args to setup. In this program they would
    be "some arg", "another" and 2
    """
    global adder
    adder = Adder()

def job(a, b):
    """wrapper function to start the job in the child process"""
    return adder.add(a, b)

if __name__ == "__main__":
    args = list(zip(range(10), range(10, 20)))
    # args == [(0, 10), (1, 11), ..., (8, 18), (9, 19)]

    with Pool(initializer=setup, initargs=["some arg", "another", 2]) as pool:
        # runs jobs in parallel and returns when all are complete
        results = pool.starmap(job, args)

    print(results)  # prints [10, 12, ..., 26, 28]

Posted on 2016-08-24 11:46:49
Untested, but something like this should work. The array and the lock are shared between the processes.
from multiprocessing import Process, Array, Lock

def f(array, lock, n):  # n is the dedicated location in the array
    lock.acquire()
    try:
        array[n] = -array[n]
    finally:
        lock.release()

if __name__ == '__main__':
    size = 100  # unused in this example
    arr = Array('i', [3, -7])
    lock = Lock()
    p = Process(target=f, args=(arr, lock, 0))
    q = Process(target=f, args=(arr, lock, 1))
    p.start()
    q.start()
    q.join()
    p.join()
    print(arr[:])  # prints [-3, 7]

The documentation here, https://docs.python.org/3.5/library/multiprocessing.html, has plenty of examples to get you started.
https://stackoverflow.com/questions/39122270