Multiprocessing shared array

Posted 2024-05-16 10:03:42


I'm trying to use multiprocessing in Python, with a pool of 4-5 processes running a method in parallel. The goal is to run 1000 Monte Carlo simulations in total (200-250 per process) instead of running all 1000 sequentially. I want each process to write into a common shared array: as soon as it finishes one simulation it should acquire a lock, write the result, and release the lock. So it should be a three-step process:

  1. Acquire the lock
  2. Write the result
  3. Release the lock so that other processes waiting to write to the array can proceed.

Every time I pass the array to a process, that process gets its own copy of the array, which is not what I want, since I need one common array. Can anyone provide sample code to help?


2 Answers

Untested, but something like this should work. The array and the lock are shared between the processes.

from multiprocessing import Process, Array, Lock

def f(array, lock, n):  # n is this process's dedicated slot in the array
    lock.acquire()
    array[n] = -array[n]
    lock.release()

if __name__ == '__main__':
    arr = Array('i', [3, -7])
    lock = Lock()
    p = Process(target=f, args=(arr, lock, 0))
    q = Process(target=f, args=(arr, lock, 1))
    p.start()
    q.start()
    q.join()
    p.join()

    print(arr[:])

The documentation at https://docs.python.org/3.5/library/multiprocessing.html has plenty of examples to get you started.

Since you are only returning state from the child processes to the parent, a shared array and explicit locks are overkill. You can use Pool.map or Pool.starmap to accomplish what you need. For example:

from multiprocessing import Pool

class Adder:
    """I'm using this class in place of a monte carlo simulator"""

    def add(self, a, b):
        return a + b

def setup(x, y, z):
    """Sets up the worker processes of the pool. 
    Here, x, y, and z would be your global settings. They are only included
    as an example of how to pass args to setup. In this program they would
    be "some arg", "another" and 2
    """
    global adder
    adder = Adder()

def job(a, b):
    """wrapper function to start the job in the child process"""
    return adder.add(a, b)

if __name__ == "__main__":   
    args = list(zip(range(10), range(10, 20)))
    # args == [(0, 10), (1, 11), ..., (8, 18), (9, 19)]

    with Pool(initializer=setup, initargs=["some arg", "another", 2]) as pool:
        # runs jobs in parallel and returns when all are complete
        results = pool.starmap(job, args)

    print(results) # prints [10, 12, ..., 26, 28] 
