<p><code>Pool()</code>'s <code>initializer</code> parameter expects a callable, not the result of calling one: replace <code>initializer=init(array)</code> with <code>initializer=init, initargs=(array,)</code>.</p>
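<p>A minimal sketch of the difference, reusing the <code>init</code> and <code>array</code> names from the full example below: with <code>initializer=init(array)</code>, <code>init</code> runs immediately in the parent process and <code>Pool()</code> receives its return value (<code>None</code>) as the initializer, so the workers never set the global; with <code>initializer=init, initargs=(array,)</code>, each worker calls <code>init(array)</code> when it starts:</p>
<pre class="lang-py prettyprint-override"><code>import multiprocessing as mp

def init(shared_array_):
    # runs once in each worker process; stores the shared array in a global
    global shared_array
    shared_array = shared_array_

if __name__ == "__main__":
    array = mp.RawArray('i', 10)

    # wrong: init(array) is evaluated here, in the parent, and its return
    # value (None) is what Pool() actually receives as the initializer
    # p = mp.Pool(initializer=init(array))

    # right: Pool() gets the function itself; every worker calls init(array)
    p = mp.Pool(initializer=init, initargs=(array,))
    p.close()
    p.join()
</code></pre>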
<p>To pass keyword arguments to the function <code>f()</code> used with the <code>pool.*map*</code> family, you can create a wrapper <code>mp_f()</code>:</p>
<pre class="lang-py prettyprint-override"><code>#!/usr/bin/env python
import logging
import multiprocessing as mp
from contextlib import closing
def init(shared_array_):
    # globals that should be available in worker processes should be
    # initialized here
    global shared_array
    shared_array = shared_array_

def f(interval, a=None, b=None):
    mp.get_logger().info("interval=%r, a=%r, b=%r" % (interval, a, b))
    shared_array[interval] = [a + interval.start]*b  # fake computations

def mp_f(arg_kwargs):
    try:
        arg, kwargs = arg_kwargs
        return f(arg, **kwargs)  # pass keyword args to f()
    except Exception:
        mp.get_logger().error("f%r failed" % (arg_kwargs,))

def main():
    mp.log_to_stderr().setLevel(logging.INFO)

    N = 10**6
    array = mp.RawArray('i', N)  # create shared array

    # create workers pool; use all available CPU cores
    with closing(mp.Pool(initializer=init, initargs=(array,))) as p:
        options = dict(a=5, b=N//4)  # dummy options
        step = options['b']
        args = ((slice(i, i+step), options) for i in range(0, N, step))
        for _ in p.imap_unordered(mp_f, args):  # submit jobs
            pass
    p.join()
    mp.get_logger().info(array[::step])

if __name__ == "__main__":
    mp.freeze_support()  # for py2exe and the-like on Windows
    main()
</code></pre>
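<p>When the keyword arguments are the same for every job (as they are above), <code>functools.partial</code> is an alternative to the <code>mp_f()</code> wrapper. This is just a sketch of that variant; the toy <code>f()</code> and the <code>worker</code> name are illustrative, not part of the example above:</p>
<pre class="lang-py prettyprint-override"><code>#!/usr/bin/env python
from functools import partial
from contextlib import closing
import multiprocessing as mp

def f(x, a=None, b=None):
    return x * a + b

if __name__ == "__main__":
    mp.freeze_support()
    # keyword args are fixed up front; a partial of a module-level
    # function is picklable, so it can be sent to the workers
    worker = partial(f, a=5, b=2)
    with closing(mp.Pool()) as p:
        print(p.map(worker, range(4)))  # -&gt; [2, 7, 12, 17]
    p.join()
</code></pre>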