TypeError: cannot pickle '_thread.lock' object with RQ (Redis Queue) [Python 3.9 & 3.8 & 3.7]

I have been trying to use RQ (Redis Queue) to queue API requests to BigQuery, because they take so long that they cause H12 (request timeout) errors on Heroku. The code keeps crashing when one queued job passes its DataFrame on to the next.

Here is my worker.py file:

import os

import redis
from rq import Worker, Queue, Connection

# queues are consumed in this order of priority
listen = ['high', 'default', 'low']

# Redis To Go URL on Heroku, local Redis otherwise
redis_url = os.getenv('REDISTOGO_URL', 'redis://localhost:6379')

conn = redis.from_url(redis_url)

if __name__ == '__main__':
    with Connection(conn):
        worker = Worker(map(Queue, listen))
        worker.work()

This is what triggers the error:

data_full = q.enqueue(furnish_stops)
daily_count = q.enqueue(furnish_daily, data_full)
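
(q is not shown above, but presumably it is a standard rq.Queue bound to the same Redis connection as the worker; a minimal sketch of that setup, since the exact code is not in the question:)

import os

from redis import Redis
from rq import Queue

conn = Redis.from_url(os.getenv('REDISTOGO_URL', 'redis://localhost:6379'))
q = Queue(connection=conn)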

The first function simply calls the API to download the data into the data_full DataFrame, which is then passed to the second function to build an array for visualization purposes. In rough terms the two functions look like the sketch below.
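
A sketch only; the actual BigQuery query, table, and column names here are placeholders:

from google.cloud import bigquery

def furnish_stops():
    # download the raw data from BigQuery into a DataFrame
    client = bigquery.Client()
    return client.query('SELECT * FROM `my_dataset.stops`').to_dataframe()  # placeholder query

def furnish_daily(df):
    # condense the DataFrame into a per-day array for the visualization
    return df.groupby('date').size().to_numpy()  # placeholder column name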

Full traceback:

Traceback (most recent call last):
  File "app copy.py", line 29, in <module>
    daily_count = q.enqueue(furnish_daily, data_full)
  File "/home/alexis/.local/lib/python3.8/site-packages/rq/queue.py", line 502, in enqueue
    return self.enqueue_call(
  File "/home/alexis/.local/lib/python3.8/site-packages/rq/queue.py", line 400, in enqueue_call
    return self.enqueue_job(job, pipeline=pipeline, at_front=at_front)
  File "/home/alexis/.local/lib/python3.8/site-packages/rq/queue.py", line 560, in enqueue_job
    job.save(pipeline=pipe)
  File "/home/alexis/.local/lib/python3.8/site-packages/rq/job.py", line 648, in save
    mapping = self.to_dict(include_meta=include_meta)
  File "/home/alexis/.local/lib/python3.8/site-packages/rq/job.py", line 590, in to_dict
    'data': zlib.compress(self.data),
  File "/home/alexis/.local/lib/python3.8/site-packages/rq/job.py", line 270, in data
    self._data = self.serializer.dumps(job_tuple)
TypeError: cannot pickle '_thread.lock' object

I have tried this on Python 3.7.11, 3.8.10 and 3.9.6, and I get the same error on all of them.

The only other mention of a solution to a similar problem that I have found is in this thread, but its fix of downgrading to 3.7 did not work for me.
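
For reference, q.enqueue returns a Job handle rather than the function's return value, so data_full above is a Job, not a DataFrame. RQ's documented way to chain work is the depends_on argument; a minimal sketch of that pattern (how a finished job's result is read back varies between RQ versions):

job1 = q.enqueue(furnish_stops)
# pass the job id (a plain string) instead of the Job object, and let
# RQ hold the second job back until the first has finished
daily_count = q.enqueue(furnish_daily, job1.id, depends_on=job1)

# inside furnish_daily, the upstream result can then be read back with
# rq.job.Job.fetch(stops_job_id, connection=conn).result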

