
Scrapy spider using a Django item fails to run under Celery


慕仙森 2019-02-23 20:18:08
I wrote a spider with Scrapy; the data it scrapes is saved into a Django project's database via DjangoItem. When I run the spider on its own, everything works. But I want to add the spider to a task queue and run it periodically, so I use Celery for task management.

This is my Celery task:

```python
# coding_task.py
import sys

from celery import Celery
from collector.collector.crawl_agent import crawl

app = Celery('coding.net', backend='redis', broker='redis://localhost:6379/0')
app.config_from_object('celery_config')


@app.task
def period_task():
    crawl()
```

`collector.collector.crawl_agent.crawl` is a spider that uses a Django item. The item is defined like this:

```python
import os

import django

os.environ['DJANGO_SETTINGS_MODULE'] = 'RaPo3.settings'
django.setup()

import scrapy
from scrapy_djangoitem import DjangoItem
from xxx.models import Collection


class CodingItem(DjangoItem):
    django_model = Collection
    amount = scrapy.Field(default=0)
    role = scrapy.Field()
    type = scrapy.Field()
    duration = scrapy.Field()
    detail = scrapy.Field()
    extra = scrapy.Field()
```

Now if I run `celery -A coding_task worker --loglevel=info --concurrency=1`, the worker fails with:

```
[2016-11-16 17:33:41,934: ERROR/Worker-1] Process Worker-1
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/billiard/process.py", line 292, in _bootstrap
    self.run()
  File "/usr/local/lib/python2.7/site-packages/billiard/pool.py", line 292, in run
    self.after_fork()
  File "/usr/local/lib/python2.7/site-packages/billiard/pool.py", line 395, in after_fork
    self.initializer(*self.initargs)
  File "/usr/local/lib/python2.7/site-packages/celery/concurrency/prefork.py", line 80, in process_initializer
    signals.worker_process_init.send(sender=None)
  File "/usr/local/lib/python2.7/site-packages/celery/utils/dispatch/signal.py", line 151, in send
    response = receiver(signal=self, sender=sender, **named)
  File "/usr/local/lib/python2.7/site-packages/celery/fixups/django.py", line 152, in on_worker_process_init
    self._close_database()
  File "/usr/local/lib/python2.7/site-packages/celery/fixups/django.py", line 181, in _close_database
    funs = [self._db.close_connection]  # pre multidb
AttributeError: 'module' object has no attribute 'close_connection'
[2016-11-16 17:33:41,942: INFO/MainProcess] Connected to redis://localhost:6379/0
[2016-11-16 17:33:41,957: INFO/MainProcess] mingle: searching for neighbors
[2016-11-16 17:33:42,962: INFO/MainProcess] mingle: all alone
[2016-11-16 17:33:42,968: WARNING/MainProcess] /usr/local/lib/python2.7/site-packages/celery/fixups/django.py:199: UserWarning: Using settings.DEBUG leads to a memory leak, never use this setting in production environments!
  warnings.warn('Using settings.DEBUG leads to a memory leak, never '
[2016-11-16 17:33:42,968: WARNING/MainProcess] celery@MacBook-Pro.local ready.
[2016-11-16 17:33:42,969: ERROR/MainProcess] Process 'Worker-1' pid:2777 exited with 'exitcode 1'
[2016-11-16 17:33:42,991: ERROR/MainProcess] Unrecoverable error: WorkerLostError('Could not start worker processes',)
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/celery/worker/__init__.py", line 208, in start
    self.blueprint.start(self)
  File "/usr/local/lib/python2.7/site-packages/celery/bootsteps.py", line 127, in start
    step.start(parent)
  File "/usr/local/lib/python2.7/site-packages/celery/bootsteps.py", line 378, in start
    return self.obj.start()
  File "/usr/local/lib/python2.7/site-packages/celery/worker/consumer.py", line 271, in start
    blueprint.start(self)
  File "/usr/local/lib/python2.7/site-packages/celery/bootsteps.py", line 127, in start
    step.start(parent)
  File "/usr/local/lib/python2.7/site-packages/celery/worker/consumer.py", line 766, in start
    c.loop(*c.loop_args())
  File "/usr/local/lib/python2.7/site-packages/celery/worker/loops.py", line 50, in asynloop
    raise WorkerLostError('Could not start worker processes')
WorkerLostError: Could not start worker processes
```

Is this the Django environment conflicting with the Celery environment? If I strip everything related to the Django item out of the item definition, it all runs fine.

How should I run this Celery + Scrapy task that includes a Django item? Thanks!

Source: https://github.com/celery/cel...
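For reference, the `crawl()` entry point itself is not shown above. A minimal sketch of what such a wrapper might look like, assuming a spider registered under the name `'coding'` (both the name and the child-process approach are assumptions, not the asker's actual code):

```python
# crawl_agent.py -- hypothetical sketch; the asker's actual crawl() is not shown.
import multiprocessing

from scrapy.crawler import CrawlerProcess
from scrapy.utils.project import get_project_settings


def _run_spider():
    # Build a crawler from the Scrapy project settings and run one spider.
    process = CrawlerProcess(get_project_settings())
    process.crawl('coding')  # spider name is an assumption
    process.start()  # blocks until the crawl finishes


def crawl():
    # Run the crawl in a child process: Twisted's reactor cannot be
    # restarted within one process, so a fresh process per invocation
    # keeps periodic Celery runs from tripping over a used reactor.
    p = multiprocessing.Process(target=_run_spider)
    p.start()
    p.join()
```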

1 Answer

白板的微信


Since no one answered, I'll answer my own question: it was solved on GitHub.

The cause is a compatibility problem between Celery 3.1 and Django 1.7 and above. Once the Scrapy item module sets `os.environ['DJANGO_SETTINGS_MODULE'] = 'RaPo3.settings'`, Celery detects the settings module and applies its Django fixup, treating the task as a Django project. On worker start, the fixup tries to call `django.db.close_connection`, which newer Django versions no longer provide, producing the `AttributeError` in the traceback above.
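If you must stay on Celery 3.1 for a while, one possible stopgap is to restore the attribute the fixup expects before the worker processes initialize. A minimal sketch, assuming Django 1.7+ (which provides `close_old_connections`); importing this at the top of `coding_task.py` should be enough:

```python
# celery31_django_shim.py -- workaround sketch, not the upstream fix.
# Celery 3.1's Django fixup calls django.db.close_connection, which
# newer Django versions removed; alias it to close_old_connections
# so the worker_process_init handler can run without raising.
import django.db

if not hasattr(django.db, 'close_connection'):
    django.db.close_connection = django.db.close_old_connections
```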

The most direct fix is to upgrade Celery to a 4.0+ release, where this compatibility problem is resolved.
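For example (the exact version pin is a matter of choice):

```
pip install "celery>=4.0" --upgrade
```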

If you use Redis as the Celery backend, then after upgrading to 4.0 you may hit a timeout error connecting to Redis when tasks start. That is caused by an outdated Python redis library; make sure the redis Python lib is upgraded to 2.10.4 or later:

```
pip install redis==2.10.5 --upgrade
```

Replied 2019-03-04