How to pass a variable from a Makefile to a Python script via docker-compose

Posted 2024-04-24 21:32:47


I have a Makefile that runs docker-compose, which has a container that executes a Python script. I want to be able to pass a variable to the Makefile on the command line and print it in the Python script (testing.py).

My directory looks like this:

main_folder:
  -docker-compose.yaml
  -Makefile
  -testing.py

I tried the following configuration. The Makefile is:

.PHONY: run run-prod stop stop-prod rm

run:
    WORKING_DAG=$(working_dag) docker-compose -f docker-compose.yml up -d --remove-orphans --build --force-recreate

The docker-compose file is:

version: "3.7"
services:
  prepare_files:
    image: apache/airflow:1.10.14
    environment:
      WORKING_DAG: ${working_dag}
      PYTHONUNBUFFERED: 1
    entrypoint: /bin/bash
    command: -c "python3 testing.py $$WORKING_DAG"

The file testing.py is:

import sys

print(sys.argv[0], flush=True)

When I run on the command line:

 make working_dag=testing run

It doesn't fail, but it also doesn't print anything. How can I make this work? Thanks


Tags: file, compose, docker, run, command line, py, script, prod
2 Answers

You need to mount testing.py into the container (using volumes). In the example below, the current working directory (${PWD}) is used and testing.py is mounted into the container's root directory:

version: "3.7"
services:
  prepare_files:
    image: apache/airflow:1.10.14
    volumes:
      - ${PWD}/testing.py:/testing.py
    environment:
      PYTHONUNBUFFERED: 1
    entrypoint: /bin/bash
    command: -c "python3 /testing.py ${WORKING_DAG}"

NOTE: There's no need to include WORKING_DAG in the service definition, as it's already exposed to the Docker Compose environment by your Makefile. Setting it as you did overwrites it with "" (an empty string), because ${working_dag} was your original environment variable but you remapped it to WORKING_DAG in your Makefile's run step.
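As an aside (not part of the original answer), the note above hinges on how Compose interpolates variables. An illustrative fragment of the same service definition, with the two forms contrasted in comments:

    entrypoint: /bin/bash
    # ${WORKING_DAG} (single $): substituted by docker-compose itself, from the environment
    # of the docker-compose process -- which the Makefile's run target sets.
    command: -c "python3 /testing.py ${WORKING_DAG}"
    # $$WORKING_DAG (double $) would instead reach the container as a literal $WORKING_DAG
    # and only expand if the container's own environment defines it (via environment:),
    # which is what the original question relied on.

With the script mounted at /testing.py, testing.py can then be updated to print all of its arguments: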

import sys

print(sys.argv[0:], flush=True)

Then:

make --always-make working_dag=Freddie run
WORKING_DAG=Freddie docker-compose --file=./docker-compose.yaml up
Recreating 66014039_prepare_files_1 ... done
Attaching to 66014039_prepare_files_1
prepare_files_1  | ['/testing.py', 'Freddie']
66014039_prepare_files_1 exited with code 0
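A further note (not from the original answer): the question's Makefile starts the stack with up -d, so the container's output is never attached to the terminal there. With detached containers the script's output can still be read from the service logs, for example:

docker-compose logs prepare_files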

I believe the variable WORKING_DAG is assigned correctly from the command line and that the Makefile passes it correctly to docker-compose. I verified this by running the container in a way that keeps it from being destroyed, and then, after logging into the container, checking the value of WORKING_DAG:

To keep the container from being destroyed after docker finishes executing, I modified docker-compose.yml as follows:

version: "3.7"
services:
  prepare_files:
    image: apache/airflow:1.10.14
    environment:
      WORKING_DAG: ${working_dag}
      PYTHONUNBUFFERED: 1
    entrypoint: /bin/bash
    command: -c "python3 testing.py $$WORKING_DAG"
    command: -c "tail -f /dev/null"

airflow@d8dcb07c926a:/opt/airflow$ echo $WORKING_DAG
testing

The problem of docker not showing Python's stdout when deploying with docker-compose has been discussed on GitHub here, but it remains unresolved. It only works with docker-compose when we copy/mount the file into the container, or when we use a Dockerfile.

When using a Dockerfile, simply run the script with the shell form of CMD, so that $WORKING_DAG is expanded from the container's environment at run time (the exec/JSON form would pass the literal string "$WORKING_DAG"):

CMD python -u testing.py $WORKING_DAG

To mount the script into the container, see @DazWilkin's answer here.
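For completeness, a minimal Dockerfile sketch along those lines (the file layout is an assumption; only the CMD line comes from the answer above):

FROM apache/airflow:1.10.14
# Copy the script into the image so it exists inside the container
COPY testing.py /testing.py
# Shell form, so $WORKING_DAG is expanded at run time from the container's
# environment (set e.g. via docker-compose's environment:); -u disables
# Python's output buffering
CMD python -u /testing.py $WORKING_DAG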
