
Python multiprocessing synchronization

Stack Overflow user
Asked on 2014-08-19 01:44:05
2 Answers · 9.4K Views · 0 Following · Score 3

I have a function "function" that I want to call 10 times using multiprocessing, as 2 rounds of 5 CPUs.

Therefore I need a way to synchronize the processes, as described in the code below.

Is this possible without using a multiprocessing pool? When I use one, I get strange errors (e.g. "UnboundLocalError: local variable 'fd' referenced before assignment" — I have no such variable), and the processes seem to terminate randomly.

If possible, I would like to do this without a pool. Thanks!

number_of_cpus = 5
number_of_iterations = 2
number_of_files_per_process = 10  # example value; not defined in the original snippet

# An array for the processes.
processing_jobs = []

# Start 5 processes 2 times.
for iteration in range(0, number_of_iterations):

    # TODO SYNCHRONIZE HERE

    # Start 5 processes at a time.
    for cpu_number in range(0, number_of_cpus):

        # Calculate an offset for the current function call.
        file_offset = iteration * cpu_number * number_of_files_per_process

        p = multiprocessing.Process(target=function, args=(file_offset,))
        processing_jobs.append(p)
        p.start()

    # TODO SYNCHRONIZE HERE

Here is the (anonymized) traceback of the errors I get when I run the code with a pool:

Process Process-5:
Traceback (most recent call last):
  File "/usr/lib/python2.7/multiprocessing/process.py", line 258, in _bootstrap
    self.run()
  File "/usr/lib/python2.7/multiprocessing/process.py", line 114, in run
    self._target(*self._args, **self._kwargs)
  File "python_code_3.py", line 88, in function_x
    xyz = python_code_1.function_y(args)
  File "/python_code_1.py", line 254, in __init__
    self.WK =  file.WK(filename)
  File "/python_code_2.py", line 1754, in __init__
    self.__parse__(name, data, fast_load)
  File "/python_code_2.py", line 1810, in __parse__
    fd.close()
UnboundLocalError: local variable 'fd' referenced before assignment

Most of the processes crash like this, but not all of them. More of them seem to crash when I increase the number of processes. I also thought this might be due to memory limitations...
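For what it's worth, that particular error usually points at library code that opens a file inside a try block and closes it in cleanup code: if the open() itself fails (for instance because the process ran out of file descriptors or memory), `fd` is never assigned, and the cleanup then trips over the unbound name. A minimal sketch of the pattern (the names here are illustrative, not the library's actual code):

```python
def parse(path):
    # Mimics the pattern suggested by the traceback: if open() raises
    # before fd is assigned, the finally block references an unbound name.
    try:
        fd = open(path, "rb")
        return fd.read()
    finally:
        fd.close()

err = None
try:
    parse("/nonexistent/file")
except NameError as exc:  # UnboundLocalError is a subclass of NameError
    err = exc
print(type(err).__name__)  # prints: UnboundLocalError
```

So the UnboundLocalError is likely masking an earlier failure inside the library, which would also explain why more processes crash as their number grows.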


2 Answers

Stack Overflow user

Accepted answer

Answered on 2014-08-19 02:27:44

Here's how you can do the synchronization you're looking for without using a pool:

import multiprocessing

def function(arg):
    print ("got arg %s" % arg)

if __name__ == "__main__":
    number_of_cpus = 5
    number_of_iterations = 2
    number_of_files_per_process = 10  # example value; not defined in the original snippet

    # An array for the processes.
    processing_jobs = []

    # Start 5 processes 2 times.
    for iteration in range(1, number_of_iterations+1):  # Start the range from 1 so we don't multiply by zero.

        # Start 5 processes at a time.
        for cpu_number in range(1, number_of_cpus+1):

            # Calculate an offset for the current function call.
            file_offset = iteration * cpu_number * number_of_files_per_process

            p = multiprocessing.Process(target=function, args=(file_offset,))
            processing_jobs.append(p)
            p.start()

        # Wait for all processes to finish.
        for proc in processing_jobs:
            proc.join()

        # Empty active job list.
        del processing_jobs[:]

        # Write file here
        print("Writing")

And here it is with a Pool:

import multiprocessing

def function(arg):
    print ("got arg %s" % arg)

if __name__ == "__main__":
    number_of_cpus = 5
    number_of_iterations = 2
    number_of_files_per_process = 10  # example value; not defined in the original snippet

    pool = multiprocessing.Pool(number_of_cpus)
    for i in range(1, number_of_iterations+1): # Start the range from 1 so we don't multiply by zero
        file_offsets = [number_of_files_per_process * i * cpu_num for cpu_num in range(1, number_of_cpus+1)] 
        pool.map(function, file_offsets)
        print("Writing")
        # Write file here

As you can see, the Pool solution is nicer.

However, this doesn't solve your traceback problem. Without knowing what's actually causing it, it's hard to say how to fix it. You may need to use a multiprocessing.Lock to synchronize access to the resource.

Score 1

Stack Overflow user

Answered on 2014-08-19 01:54:33

A Pool can be very easy to use. Here's a complete example:

Source

import multiprocessing

def calc(num):
    return num*2

if __name__=='__main__':  # required for Windows
    pool = multiprocessing.Pool()   # one Process per CPU
    for output in pool.map(calc, [1,2,3]):
        print 'output:',output

Output

output: 2
output: 4
output: 6
Score 1
Original content provided by Stack Overflow; translation supported by Tencent Cloud's IT-domain engine.
Original link: https://stackoverflow.com/questions/25369078
