I need to use multiprocessing in Python, but I've found that neither multiprocessing.pool.Pool nor concurrent.futures.ProcessPoolExecutor provides a way to terminate a child process. I need this functionality urgently; how can I implement it?
I've read the ProcessPoolExecutor source code. Roughly, submit() wraps the call into a work item and adds it to call_queue, and _process_worker then takes the call item off the queue and runs it. It doesn't look like termination could be implemented even with a hack...
Below is part of the core ProcessPoolExecutor source code...
def _process_worker(call_queue, result_queue):
    """Evaluates calls from call_queue and places the results in result_queue.

    This worker is run in a separate process.

    Args:
        call_queue: A multiprocessing.Queue of _CallItems that will be read and
            evaluated by the worker.
        result_queue: A multiprocessing.Queue of _ResultItems that will written
            to by the worker.
        shutdown: A multiprocessing.Event that will be set as a signal to the
            worker that it should exit when call_queue is empty.
    """
    while True:
        call_item = call_queue.get(block=True)
        if call_item is None:
            # Wake up queue management thread
            result_queue.put(os.getpid())
            return
        try:
            r = call_item.fn(*call_item.args, **call_item.kwargs)
        except BaseException as e:
            exc = _ExceptionWithTraceback(e, e.__traceback__)
            result_queue.put(_ResultItem(call_item.work_id, exception=exc))
        else:
            result_queue.put(_ResultItem(call_item.work_id,
                                         result=r))
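As far as I can tell, the public API reflects the same limitation: Future.cancel() only succeeds while a work item is still pending, and returns False once the call has started running. A minimal sketch illustrating this (slow() and the sleep timings are just for illustration):

import time
from concurrent.futures import ProcessPoolExecutor

def slow(x):
    time.sleep(5)
    return x

if __name__ == '__main__':
    with ProcessPoolExecutor(max_workers=1) as ex:
        f = ex.submit(slow, 1)
        time.sleep(0.5)       # give the worker time to pick the item up
        print(f.cancel())     # False: a call that is already running cannot be cancelled
        print(f.result())     # 1 -- the task runs to completion anyway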
3 Answers
MMTTMM
.close() stops the pool gracefully (no new tasks are accepted, and the workers exit once the queued tasks are done), while .terminate() kills the worker processes immediately. Note that these are methods of multiprocessing.Pool; ProcessPoolExecutor does not expose them.
I'm not sure what use case requires this explicitly, though. When I use concurrent.futures I've never needed it; I just use a with statement, and the pool exits once the tasks are processed.
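For example, a minimal sketch with multiprocessing.Pool (work() and the timings are just illustrative):

import multiprocessing
import time

def work(x):
    time.sleep(60)       # simulate a long-running task
    return x

if __name__ == '__main__':
    pool = multiprocessing.Pool(processes=2)
    results = [pool.apply_async(work, (i,)) for i in range(4)]
    time.sleep(1)
    pool.terminate()     # forcibly stops all workers, even mid-task
    pool.join()          # wait for the worker processes to exit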
喵喔喔
### example
import os
import signal

def handle_sigterm(signum, frame):
    # do cleanup here, then exit the child immediately
    os._exit(0)

# in the subprocess: register the handler
signal.signal(signal.SIGTERM, handle_sigterm)

# wherever you want to kill the subprocess (pid is the child's process id)
os.kill(pid, signal.SIGTERM)
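Putting it together: if you manage the child processes yourself with multiprocessing.Process instead of a pool, you know every child's pid and can terminate any task at will. A runnable sketch for Unix (worker() and the timings are assumptions for illustration):

import multiprocessing
import os
import signal
import time

def worker(x):
    def handle_sigterm(signum, frame):
        # clean up here before the child exits
        os._exit(0)
    signal.signal(signal.SIGTERM, handle_sigterm)
    time.sleep(60)       # simulate a long-running task

if __name__ == '__main__':
    p = multiprocessing.Process(target=worker, args=(1,))
    p.start()
    time.sleep(1)
    os.kill(p.pid, signal.SIGTERM)   # or equivalently p.terminate()
    p.join()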