Celery can't start worker processes when using a Scrapy DjangoItem

Published 2024-06-11 01:54:46


Original issue: https://github.com/celery/celery/issues/3598

I want to use Celery to run a Scrapy spider whose items are DjangoItems.

Here is my Celery task:

# coding_task.py
import sys

from celery import Celery
from collector.collector.crawl_agent import crawl

app = Celery('coding.net', backend='redis', broker='redis://localhost:6379/0')
app.config_from_object('celery_config')


@app.task
def period_task():
    crawl()

collector.collector.crawl_agent.crawl contains a crawler that uses a DjangoItem as its item. The item is:

import scrapy
from scrapy_djangoitem import DjangoItem  # exact DjangoItem import used isn't shown in the post

class CodingItem(DjangoItem):
    django_model = ...  # the Django model behind the item isn't shown in the post
    amount = scrapy.Field(default=0)
    role = scrapy.Field()
    type = scrapy.Field()
    duration = scrapy.Field()
    detail = scrapy.Field()
    extra = scrapy.Field()
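
The crawl() entry point imported by the task isn't shown in the post. As a rough sketch (not the poster's actual code), such a helper typically starts the spider in-process with CrawlerProcess; the spider name and the use of the project settings here are assumptions:

# collector/collector/crawl_agent.py -- hypothetical sketch
from scrapy.crawler import CrawlerProcess
from scrapy.utils.project import get_project_settings

def crawl():
    # Run the spider inside the current (worker) process.
    process = CrawlerProcess(get_project_settings())
    process.crawl('coding')  # placeholder spider name
    process.start()          # blocks until the crawl finishes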

When I run celery -A coding_task worker --loglevel=info --concurrency=1, it fails with the following error:

[2016-11-16 17:33:41,934: ERROR/Worker-1] Process Worker-1
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/billiard/process.py", line 292, in _bootstrap
    self.run()
  File "/usr/local/lib/python2.7/site-packages/billiard/pool.py", line 292, in run
    self.after_fork()
  File "/usr/local/lib/python2.7/site-packages/billiard/pool.py", line 395, in after_fork
    self.initializer(*self.initargs)
  File "/usr/local/lib/python2.7/site-packages/celery/concurrency/prefork.py", line 80, in process_initializer
    signals.worker_process_init.send(sender=None)
  File "/usr/local/lib/python2.7/site-packages/celery/utils/dispatch/signal.py", line 151, in send
    response = receiver(signal=self, sender=sender, **named)
  File "/usr/local/lib/python2.7/site-packages/celery/fixups/django.py", line 152, in on_worker_process_init
    self._close_database()
  File "/usr/local/lib/python2.7/site-packages/celery/fixups/django.py", line 181, in _close_database
    funs = [self._db.close_connection]  # pre multidb
AttributeError: 'module' object has no attribute 'close_connection'
[2016-11-16 17:33:41,942: INFO/MainProcess] Connected to redis://localhost:6379/0
[2016-11-16 17:33:41,957: INFO/MainProcess] mingle: searching for neighbors
[2016-11-16 17:33:42,962: INFO/MainProcess] mingle: all alone
/usr/local/lib/python2.7/site-packages/celery/fixups/django.py:199: UserWarning: Using settings.DEBUG leads to a memory leak, never use this setting in production environments!
  warnings.warn('Using settings.DEBUG leads to a memory leak, never '

[2016-11-16 17:33:42,968: WARNING/MainProcess] /usr/local/lib/python2.7/site-packages/celery/fixups/django.py:199: UserWarning: Using settings.DEBUG leads to a memory leak, never use this setting in production environments!
  warnings.warn('Using settings.DEBUG leads to a memory leak, never '

[2016-11-16 17:33:42,968: WARNING/MainProcess] celery@MacBook-Pro.local ready.
[2016-11-16 17:33:42,969: ERROR/MainProcess] Process 'Worker-1' pid:2777 exited with 'exitcode 1'
[2016-11-16 17:33:42,991: ERROR/MainProcess] Unrecoverable error: WorkerLostError('Could not start worker processes',)
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/celery/worker/__init__.py", line 208, in start
    self.blueprint.start(self)
  File "/usr/local/lib/python2.7/site-packages/celery/bootsteps.py", line 127, in start
    step.start(parent)
  File "/usr/local/lib/python2.7/site-packages/celery/bootsteps.py", line 378, in start
    return self.obj.start()
  File "/usr/local/lib/python2.7/site-packages/celery/worker/consumer.py", line 271, in start
    blueprint.start(self)
  File "/usr/local/lib/python2.7/site-packages/celery/bootsteps.py", line 127, in start
    step.start(parent)
  File "/usr/local/lib/python2.7/site-packages/celery/worker/consumer.py", line 766, in start
    c.loop(*c.loop_args())
  File "/usr/local/lib/python2.7/site-packages/celery/worker/loops.py", line 50, in asynloop
    raise WorkerLostError('Could not start worker processes')
WorkerLostError: Could not start worker processes
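
The crash happens in Celery's Django fixup rather than in the spider itself: the fixup is clearly active (it is also what emits the settings.DEBUG warning above), and on worker_process_init each child process runs its _close_database(), which in this Celery version falls back to the pre-multi-db django.db.close_connection. Recent Django releases no longer provide that attribute, so every child dies during startup. Below is a minimal stop-gap sketch that only papers over that missing attribute (placed before the Celery app is created); whether it is enough depends on the exact Celery/Django versions, and upgrading Celery is presumably the cleaner fix.

# top of coding_task.py -- hedged workaround sketch, not an official Celery/Django API
import django.db
from django.db import close_old_connections

if not hasattr(django.db, 'close_connection'):
    # Newer Django removed the pre-multi-db close_connection(); alias it to
    # close_old_connections() so the fixup's fallback path has something to call.
    django.db.close_connection = close_old_connections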

If I remove DjangoItem from the item:

import scrapy
from scrapy.item import Item

class CodingItem(Item):
    amount = scrapy.Field(default=0)
    role = scrapy.Field()
    type = scrapy.Field()
    duration = scrapy.Field()
    detail = scrapy.Field()
    extra = scrapy.Field()

then the task runs fine, without any errors. What should I do if I want to use DjangoItem in this Celery + Scrapy task?
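
Independent of the worker-startup error above, using DjangoItem outside a Django-managed process normally requires Django to be configured before the item (and its model) is imported. A minimal sketch, assuming a settings module (the name 'collector.settings' is a placeholder):

# top of the module that defines or imports the DjangoItem-based items
import os
import django

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'collector.settings')  # placeholder name
django.setup()  # required on Django >= 1.7 before any models are imported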

Thanks!

