Is there a way to wait for another python script called from the current script (using subprocess.Popen()) until it completes?

I am trying to run 2 python scripts (say dp_01.py and dp_02.py) from a master python script. I want to execute them one after the other. This is my current code in the master python script:
job1_exec = "python dp_01.py"
try:
    #os.system(job1_exec)
    command1 = subprocess.Popen(job1_exec, shell=True)
    command1.wait()
except:
    print("processing of dp_01 code failed")
print("Processing of dp_01 code completed")


job2_exec = "python dp_02.py"
try:
    #os.system(job2_exec)
    command2 = subprocess.Popen(job2_exec, shell=True)
    command2.wait()
except:
    print("processing of dp_02 code failed")
print("Processing of dp_02 code completed")

The problem here is that the master script does not wait for dp_01.py to finish executing. It starts executing dp_02.py immediately. How do I wait for dp_01.py to finish before dp_02.py starts executing?

One solution is to replace Popen with check_output or run. check_output is a wrapper around run that captures and returns the subprocess' stdout, and blocks the main thread while the child is running.

As one SO answer puts it:

The main difference [between 'run' and 'Popen'] is that subprocess.run executes a command and waits for it to finish, while with subprocess.Popen you can continue doing your stuff while the process finishes and then just repeatedly call subprocess.communicate yourself to pass and receive data to your process.
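That difference can be seen directly by timing the two calls. A minimal sketch (the child command just sleeps for 2 seconds; timings are approximate):

```python
import subprocess
import sys
import time

# A child process that takes ~2 seconds to finish.
cmd = [sys.executable, "-c", "import time; time.sleep(2)"]

# subprocess.run blocks until the child exits.
t0 = time.time()
subprocess.run(cmd)
print(time.time() - t0 >= 2)  # True: ~2 s elapsed before returning

# subprocess.Popen returns immediately; you must call wait() to block.
t0 = time.time()
p = subprocess.Popen(cmd)
print(time.time() - t0 < 1)   # True: returned right away
p.wait()                      # now block until the child finishes
print(time.time() - t0 >= 2)  # True
```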

Let's consider two different jobs, where the 1st takes longer than the 2nd, here simulated by sleep(7):
# dp_01.py
import time
time.sleep(7)
print("--> Hello from dp_01", end="")

and,

# dp_02.py
print("--> Hello from dp_02", end="")

Then, for ease of testing, I moved the main script's job execution into functions:

import time
import subprocess

jobs = ["dp_01.py", "dp_02.py"]

# The current approach of the OP, using 'Popen':
def do(job):
  subprocess.Popen("python "+job, shell=True)


# Alternative, main-thread-blocking approach,
def block_and_do(job):
  out = subprocess.check_output("python "+job, shell=True)
  print(out.decode('ascii')) # 'out' is a byte string


# Helper function for testing the synchronization
# of the two different job-workers above
def test_sync_of(worker_func, jobs):
  test_started = time.time()
  for job in jobs:
    print("starting job '%s', at time %d" % (job, time.time() - test_started))
    worker_func(job)
    print("completed job '%s', at time %d" % (job, time.time() - test_started))
    time.sleep(1)

This results in:

test_sync_of(do, jobs)
starting job 'dp_01.py', at time 0
completed job 'dp_01.py', at time 0
starting job 'dp_02.py', at time 1
completed job 'dp_02.py', at time 1
--> Hello from dp_02
--> Hello from dp_01

test_sync_of(block_and_do, jobs)
starting job 'dp_01.py', at time 0
--> Hello from dp_01
completed job 'dp_01.py', at time 7
starting job 'dp_02.py', at time 8
--> Hello from dp_02
completed job 'dp_02.py', at time 8

Finally, I hope this solves your problem. However, this may not be the best solution to your bigger problem. You may want to take a closer look at the multiprocessing module; perhaps the jobs in your separate scripts could be imported as modules and their work run in parallel?
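A minimal sketch of that idea, using stand-in job functions (job_01/job_02 are hypothetical names; in practice they would be the entry points imported from dp_01.py and dp_02.py):

```python
from multiprocessing import Process

def job_01():
    print("--> Hello from dp_01")

def job_02():
    print("--> Hello from dp_02")

if __name__ == "__main__":
    for job in (job_01, job_02):
        p = Process(target=job)
        p.start()
        p.join()  # block until this job finishes before starting the next
```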

A final note: you should be very careful when using shell=True: see this other SO question for details.
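For instance, passing the command as an argument list avoids the shell entirely (sys.executable picks the same interpreter that is running the master script):

```python
import subprocess
import sys

# No shell involved: the command is an argv list, so filenames containing
# spaces or shell metacharacters cannot be misinterpreted. check=True
# raises CalledProcessError if the child exits with a non-zero status.
result = subprocess.run(
    [sys.executable, "-c", "print('hello')"],
    capture_output=True, text=True, check=True,
)
print(result.stdout)  # hello
```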