I seem to have a problem with the BashOperator. I am using Airflow 1.10 installed on CentOS, in a Miniconda environment (Python 3.6), with packages from conda-forge.

When I run airflow test tutorial pyHi 2018-01-01, the output is "Hello world!" as expected.

However, when I run airflow test tutorial print_date 2018-01-01 or airflow test tutorial templated 2018-01-01, nothing happens.

This is the Linux shell output:
(etl) [root@VIRT02 airflow]# airflow test tutorial sleep 2015-06-01
[2018-09-28 19:56:09,727] {__init__.py:51} INFO - Using executor SequentialExecutor
[2018-09-28 19:56:09,962] {models.py:258} INFO - Filling up the DagBag from /root/airflow/dags
My DAG definition file, based on the Airflow tutorial, is shown below.
from airflow import DAG
from airflow.operators.bash_operator import BashOperator
from airflow.operators.python_operator import PythonOperator
from datetime import datetime, timedelta

import test

default_args = {
    'owner': 'airflow',
    'depends_on_past': False,
    'start_date': datetime(2010, 1, 1),
    'email_on_failure': False,
    'email_on_retry': False,
    'retries': 1,
    'retry_delay': timedelta(minutes=5),
}

dag = DAG(
    'tutorial',
    'My first attempt',
    schedule_interval=timedelta(days=1),
    default_args=default_args,
)

# t1, t2 and t3 are examples of tasks created by instantiating operators
t1 = BashOperator(
    task_id='print_date',
    bash_command='date',
    dag=dag)

t2 = BashOperator(
    task_id='sleep',
    bash_command='sleep 5',
    retries=3,
    dag=dag)

templated_command = """
{% for i in range(5) %}
    echo "{{ ds }}"
    echo "{{ macros.ds_add(ds, 7) }}"
    echo "{{ params.my_param }}"
{% endfor %}
"""

t3 = BashOperator(
    task_id='templated',
    bash_command=templated_command,
    params={'my_param': 'Parameter I passed in'},
    dag=dag)

t4 = BashOperator(
    task_id='hi',
    bash_command='test.sh',
    dag=dag,
)

t5 = PythonOperator(
    task_id='pyHi',
    python_callable=test.main,
    dag=dag,
)

t2.set_upstream(t1)
t3.set_upstream(t1)
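For context, the templated_command above is rendered by Jinja before the Bash command runs. A minimal standalone sketch of that rendering step, using the jinja2 library directly; the ds value and params dict are supplied by hand here, whereas inside Airflow they come from the task context (the macros.ds_add line is omitted because the macros object only exists inside Airflow):

```python
from jinja2 import Template

# Same template as in the DAG, minus the macros.ds_add() line,
# since the `macros` object is only available in Airflow's context.
templated_command = """
{% for i in range(5) %}
    echo "{{ ds }}"
    echo "{{ params.my_param }}"
{% endfor %}
"""

# Hand-supplied stand-ins for the values Airflow would inject.
rendered = Template(templated_command).render(
    ds="2018-01-01",
    params={"my_param": "Parameter I passed in"},
)
print(rendered)
```

The rendered result is the actual shell script the BashOperator hands to Bash, so each loop iteration produces one echo line per template line.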
Technically it is not that the BashOperator doesn't work; rather, the stdout of the Bash command does not show up in the Airflow logs. This is a known issue, and a ticket has been filed in Airflow's issue tracker: https://issues.apache.org/jira/browse/AIRFLOW-2674

Evidence that the BashOperator does work: if you run airflow test tutorial sleep 2015-06-01, you have to wait 5 seconds before it terminates, which is the behaviour of the Bash sleep command.
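One way to double-check what a task actually printed is to read its per-task log files, which Airflow writes under $AIRFLOW_HOME/logs by default. A sketch, assuming the default Airflow 1.10 layout of logs/&lt;dag_id&gt;/&lt;task_id&gt;/&lt;execution_date&gt;/&lt;try_number&gt;.log; the helper name task_log_dir is mine for illustration, not an Airflow API:

```python
import os
from pathlib import Path

def task_log_dir(dag_id: str, task_id: str, execution_date: str) -> Path:
    """Directory holding the log files for one task run.

    Assumes the default Airflow 1.10 layout:
    $AIRFLOW_HOME/logs/<dag_id>/<task_id>/<execution_date>/<try_number>.log
    """
    airflow_home = Path(os.environ.get("AIRFLOW_HOME", str(Path.home() / "airflow")))
    return airflow_home / "logs" / dag_id / task_id / execution_date

# Print every attempt's log for the print_date task run discussed above.
log_dir = task_log_dir("tutorial", "print_date", "2018-01-01T00:00:00")
for log_file in sorted(log_dir.glob("*.log")):
    print(log_file.read_text())
```

If the bash_command really ran, its stdout should appear in these files even when the airflow test console stays silent.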