Using multiprocessing.Process with a maximum number of simultaneous processes


Problem description

I have this Python code:

from multiprocessing import Process

def f(name):
    print('hello', name)

if __name__ == '__main__':
    for i in range(0, MAX_PROCESSES):
        p = Process(target=f, args=(i,))
        p.start()

which runs well. However, MAX_PROCESSES is variable and can be any value between 1 and 512. Since I'm only running this code on a machine with 8 cores, I need to find out if it is possible to limit the number of processes allowed to run at the same time. I've looked into multiprocessing.Queue, but it doesn't look like what I need - or perhaps I'm interpreting the docs incorrectly.

Is there a way to limit the number of simultaneous multiprocessing.Process instances running?

Recommended answer

It might be most sensible to use multiprocessing.Pool, which spawns a pool of worker processes based on the maximum number of cores available on your system, and then feeds tasks in as cores become available.

The example from the standard docs (http://docs.python.org/2/library/multiprocessing.html#using-a-pool-of-workers) shows that you can also set the number of worker processes manually:

from multiprocessing import Pool

def f(x):
    return x*x

if __name__ == '__main__':
    pool = Pool(processes=4)              # start 4 worker processes
    result = pool.apply_async(f, [10])    # evaluate "f(10)" asynchronously
    print(result.get(timeout=1))          # prints "100" unless your computer is *very* slow
    print(pool.map(f, range(10)))         # prints "[0, 1, 4,..., 81]"

And it's also handy to know that there is a multiprocessing.cpu_count() function to count the number of cores on a given system, should your code need it.

Here's some draft code that seems to work for your specific case:

import multiprocessing

def f(name):
    print('hello', name)

if __name__ == '__main__':
    pool = multiprocessing.Pool()  # uses all available cores; pass a number to limit it
    for i in range(0, 512):
        pool.apply_async(f, args=(i,))
    pool.close()
    pool.join()
