Python input() blocks subprocesses from executing
I have a Python script that accepts user input. Different user inputs trigger different functionality. The functionality in question here is the one that spawns multiple processes. Here is the script, main.py:
import time
import threading
import concurrent.futures as cf

def executeparallelprocesses():
    numprocesses = 2
    durationseconds = 10
    futures = []
    print('Submit jobs as new processes.')
    with cf.ProcessPoolExecutor(max_workers=numprocesses) as executor:
        for i in range(numprocesses):
            futures.append(executor.submit(workcpu, 500, durationseconds))
            print('job submitted')
        print('all jobs submitted')
        print('Wait for jobs to complete.', flush=True)
        for future in cf.as_completed(futures):
            future.result()
        print('All jobs done.', flush=True)

def workcpu(x, durationseconds):
    # CPU-bound busy loop that runs for durationseconds
    print('Job executing in new process.')
    start = time.time()
    while time.time() - start < durationseconds:
        x * x

def main():
    while True:
        cmd = input('Press ENTER\n')
        if cmd == 'q':
            break
        thread = threading.Thread(target=executeparallelprocesses)
        thread.start()
    # after 'q', give the background thread time to finish before exiting
    time.sleep(15)

if __name__ == '__main__':
    main()
When this script is invoked from a terminal, it works as expected (i.e., the subprocesses execute). Specifically, note the two "Job executing in new process." lines in the following sample run:
(terminal prompt $) python3 main.py
Press ENTER
Submit jobs as new processes.
Press ENTER
job submitted
job submitted
all jobs submitted
Wait for jobs to complete.
Job executing in new process.
Job executing in new process.
All jobs done.
q
(terminal prompt $)
The problem:
When the script is invoked from another program, the subprocesses do not execute. Here is the driver script, driver.py:
import time
import subprocess
from subprocess import PIPE
args = ['python3', 'main.py']
p = subprocess.Popen(args, bufsize=0, stdin=PIPE, universal_newlines=True)
time.sleep(1)
print('', file=p.stdin, flush=True)
time.sleep(1)
print('q', file=p.stdin, flush=True)
time.sleep(20)
Notice how "Job executing in new process." does not appear in the output of the following sample run:
(terminal prompt $) python3 driver.py
Press ENTER
Submit jobs as new processes.
Press ENTER
job submitted
job submitted
all jobs submitted
Wait for jobs to complete.
(terminal prompt $)
It seems the cmd = input('Press ENTER\n') statement in main.py is blocking and preventing the subprocesses from executing. Strangely, commenting out the second time.sleep(1) statement in driver.py causes the main.py subprocesses to spawn as expected. Another way to make this "work" is to add a time.sleep(1) inside the loop in main.py, immediately after thread.start().
This timing-sensitive code is brittle. Is there a reliable way to do this?
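For reference, a minimal sketch of the timing workaround described above (it reuses executeparallelprocesses from main.py); it only works because the extra sleep happens to give the worker processes time to start before the loop returns to input():

def main():
    while True:
        cmd = input('Press ENTER\n')
        if cmd == 'q':
            break
        thread = threading.Thread(target=executeparallelprocesses)
        thread.start()
        time.sleep(1)  # fragile workaround: give the pool time to start its workers
                       # before the loop goes back to blocking on input()
    time.sleep(15)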
The problem is in how you are trying to communicate with the second script using stdin=PIPE - try the following for the second script instead:
import time
import subprocess
from subprocess import PIPE
args = ['python', 'junk.py']
p = subprocess.Popen(args, bufsize=0, stdin=PIPE, universal_newlines=True)
p.communicate(input='\nq\n')
time.sleep(20)
Output:
Press ENTER
Submit jobs as new processes.
Press ENTER
job submitted
job submitted
all jobs submitted
Wait for jobs to complete.
Job executing in new process.
Job executing in new process.
All jobs done.
Process finished with exit code 0
Note that rather than sprinkling timeouts everywhere, you might want to think about joining the processes that have completed, but that is beyond the scope of this question.
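As a rough illustration of that point (my addition, assuming the main.py from the question): communicate() already blocks until the child process exits, so the trailing time.sleep(20) is not needed at all.

import subprocess
from subprocess import PIPE

# Sketch of a driver that waits for the child instead of sleeping for a fixed time.
args = ['python3', 'main.py']
p = subprocess.Popen(args, bufsize=0, stdin=PIPE, universal_newlines=True)
p.communicate(input='\nq\n')  # sends both inputs, closes stdin, waits for main.py to exit
print('child exited with return code', p.returncode)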
I tried ShadowRanger's suggestion to add a call to multiprocessing.set_start_method():

import multiprocessing

if __name__ == '__main__':
    multiprocessing.set_start_method('spawn')
    main()

This solved my problem. I will read the documentation to learn more.
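A related option (my addition, not from the original answers, and assuming Python 3.7+ where ProcessPoolExecutor accepts mp_context): instead of setting the start method globally, a spawn context can be passed to just the executor. The square helper below is only for illustration.

import multiprocessing
import concurrent.futures as cf

def square(x):
    return x * x

if __name__ == '__main__':
    # Only this executor uses the 'spawn' start method; nothing is changed globally.
    ctx = multiprocessing.get_context('spawn')
    with cf.ProcessPoolExecutor(max_workers=2, mp_context=ctx) as executor:
        print(list(executor.map(square, range(4))))  # [0, 1, 4, 9]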