Celery task not running and stuck in the PENDING state

Posted 2024-05-23 18:40:12


I'm working through one of the many tutorials online, setting up a Flask/RabbitMQ/Celery application with Docker/Docker Compose. The containers all appear to run successfully, but when I hit the endpoint the application hangs. The task seems to get stuck in the PENDING state and never actually completes. There are no errors in the Docker output, so I'm really puzzled about why this isn't working. The only output I see when I hit the endpoint is:

rabbit_1    | 2021-05-13 01:38:07.942 [info] <0.760.0> accepting AMQP connection <0.760.0> (172.19.0.4:45414 -> 172.19.0.2:5672)
rabbit_1    | 2021-05-13 01:38:07.943 [info] <0.760.0> connection <0.760.0> (172.19.0.4:45414 -> 172.19.0.2:5672): user 'rabbitmq' authenticated and granted access to vhost '/'
rabbit_1    | 2021-05-13 01:38:07.952 [info] <0.776.0> accepting AMQP connection <0.776.0> (172.19.0.4:45416 -> 172.19.0.2:5672)
rabbit_1    | 2021-05-13 01:38:07.953 [info] <0.776.0> connection <0.776.0> (172.19.0.4:45416 -> 172.19.0.2:5672): user 'rabbitmq' authenticated and granted access to vhost '/'

I'm really not sure what I'm doing wrong, as the documentation hasn't been much help.

Dockerfile

FROM python:3
COPY ./requirements.txt /app/requirements.txt
WORKDIR /app
RUN pip install -r requirements.txt
COPY . /app
ENTRYPOINT [ "python" ]
CMD ["app.py","--host=0.0.0.0"]

Flask app.py

from workerA import add_nums
from flask import (
   Flask,
   request,
   jsonify,
)
app = Flask(__name__)


@app.route("/add")
def add():
    first_num = request.args.get('f')
    second_num = request.args.get('s')
    result = add_nums.delay(first_num, second_num)
    return jsonify({'result': result.get()}), 200



if __name__ == '__main__':
    app.run(debug=True, host='0.0.0.0')
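
Note that result.get() in the /add view blocks the request until the worker has actually finished the task, which is why the endpoint appears to hang while the task sits in PENDING. A common alternative is to return the task id immediately and fetch the result from a second endpoint. A minimal sketch extending the app.py above (the /add_async and /result routes are hypothetical, not part of the original code):

from celery.result import AsyncResult

@app.route("/add_async")
def add_async():
    # Enqueue the task and respond right away instead of blocking on .get()
    task = add_nums.delay(request.args.get('f'), request.args.get('s'))
    return jsonify({'task_id': task.id}), 202


@app.route("/result/<task_id>")
def get_result(task_id):
    # Look up state/result by id using the Celery app bound to the task
    res = AsyncResult(task_id, app=add_nums.app)
    return jsonify({'state': res.state,
                    'result': res.result if res.ready() else None}), 200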

Celery workerA.py

from celery import Celery
# Celery configuration
CELERY_BROKER_URL = 'amqp://rabbitmq:rabbitmq@rabbit:5672/'
CELERY_RESULT_BACKEND = 'rpc://'
# Initialize Celery
celery = Celery('workerA', broker=CELERY_BROKER_URL, backend=CELERY_RESULT_BACKEND)


@celery.task()
def add_nums(a, b):
   return a + b
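
For reference, add_nums.delay(a, b) as used in app.py is shorthand for add_nums.apply_async(args=(a, b)); apply_async also accepts routing options such as an explicit queue. A small sketch (the queue name is only an example and assumes a worker is consuming that queue):

# Equivalent ways to enqueue the task; apply_async exposes routing options.
add_nums.delay(2, 3)
add_nums.apply_async(args=(2, 3))
add_nums.apply_async(args=(2, 3), queue='workerA')  # publish to a specific queue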

docker-compose.yml

version: "3"
services:
  web:
    build:
      context: .
      dockerfile: Dockerfile
    restart: always
    ports:
      - "5000:5000"
    depends_on:
      - rabbit
    volumes:
      - .:/app
  rabbit:
    hostname: rabbit
    image: rabbitmq:management
    environment:
      - RABBITMQ_DEFAULT_USER=rabbitmq
      - RABBITMQ_DEFAULT_PASS=rabbitmq
    ports:
      - "5673:5672"
      - "15672:15672"
  worker_1:
    build:
      context: .
    hostname: worker_1
    entrypoint: celery
    command: -A workerA worker --loglevel=info -Q workerA
    volumes:
      - .:/app
    links:
      - rabbit
    depends_on:
      - rabbit
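
Two details about this file are worth noting: inside the Compose network the other containers reach RabbitMQ at hostname rabbit on port 5672 (the 5673 mapping is only for the host machine), and the worker is started with -Q workerA, so it consumes messages only from the workerA queue. If broker connectivity is in doubt, a minimal check can be run from inside the web container (a sketch, using the kombu package that ships with Celery):

from kombu import Connection

# Open a connection to the same broker URL used in workerA.py
with Connection('amqp://rabbitmq:rabbitmq@rabbit:5672/') as conn:
    conn.ensure_connection(max_retries=3)
    print('broker reachable')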


Tags: docker, info, txt, add, app, flask, rabbitmq, connection
1 Answer
Forum user
#1 · Posted 2024-05-23 18:40:12

OK, after a lot of digging I determined the problem was the task's queue name. Celery was publishing the task to its default queue, which was not the queue the worker was consuming, and that caused the hang. I adjusted my task like this:

@celery.task(queue='workerA')
def add_nums(a, b):
   return a + b

Now it works.
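
For completeness: the underlying issue is that the worker is started with -Q workerA in docker-compose.yml, so it consumes only the workerA queue, while the task was being published to Celery's default queue (named celery) and was never picked up. Besides the decorator argument above, the same fix can be expressed in other ways. Two alternatives, shown as sketches rather than code from the original thread:

# Option 1: route the task to the workerA queue via configuration
celery.conf.task_routes = {'workerA.add_nums': {'queue': 'workerA'}}

# Option 2: choose the queue at call time
add_nums.apply_async(args=(first_num, second_num), queue='workerA')

Dropping -Q workerA from the worker command would also work, since the worker would then consume the default celery queue.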
