Celery "Getting Started" can't retrieve results; always pending

I have been trying to follow the First Steps with Celery and Next Steps guides. My setup is Windows 7 64-bit, Anaconda Python 2.7 (32-bit), the 32-bit Erlang binary install, RabbitMQ server, and Celery (installed with pip install celery).

Following the guide, I created a proj folder containing __init__.py, tasks.py, and celery.py. My __init__.py is empty. Here is celery.py:

from __future__ import absolute_import

from celery import Celery

app = Celery('proj',
             broker='amqp://',
             backend='amqp://',
             include=['proj.tasks'])

# Optional configuration, see the application user guide
app.conf.update(
    CELERY_TASK_RESULT_EXPIRES=3600,
    CELERY_TASK_SERIALIZER='json',
    CELERY_ACCEPT_CONTENT=['json'],  # Ignore other content
    CELERY_RESULT_SERIALIZER='json',
)

if __name__ == '__main__':
    app.start()
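A side note on the serializer settings above: with CELERY_ACCEPT_CONTENT=['json'], task arguments and results travel as JSON, so they must survive a round-trip like the one below. This is just a quick local sanity check, independent of Celery itself:

```python
import json

# Task arguments such as (2, 3) are serialized to JSON on their way to the broker.
args = [2, 3]
payload = json.dumps(args)   # roughly what gets put on the wire
print(json.loads(payload))   # [2, 3] -- arrives intact
```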

And here is tasks.py:

from __future__ import absolute_import

from .celery import app

@app.task
def add(x, y):
    return x + y

@app.task
def mul(x, y):
    return x * y

@app.task
def xsum(numbers):
    return sum(numbers)

First, I understand I should make sure the RabbitMQ service is running. The Services tab of Task Manager shows that RabbitMQ is indeed running. To start the Celery worker and load my tasks, I open cmd.exe, navigate to the parent of proj (a folder I called celery_demo), and run:

 celery -A proj.celery worker -l debug 

This gives the following output:

C:\Users\bnables\Documents\Python\celery_demo>celery -A proj.celery worker -l debug
[2014-08-25 17:00:09,308: DEBUG/MainProcess] | Worker: Preparing bootsteps.
[2014-08-25 17:00:09,313: DEBUG/MainProcess] | Worker: Building graph...
[2014-08-25 17:00:09,315: DEBUG/MainProcess] | Worker: New boot order: {Timer, Hub, Queues (intra), Pool, Autoreloader, Autoscaler, StateDB, Beat, Consumer}
[2014-08-25 17:00:09,322: DEBUG/MainProcess] | Consumer: Preparing bootsteps.
[2014-08-25 17:00:09,322: DEBUG/MainProcess] | Consumer: Building graph...
[2014-08-25 17:00:09,332: DEBUG/MainProcess] | Consumer: New boot order: {Connection, Events, Heart, Mingle, Gossip, Tasks, Control, Agent, event loop}

 -------------- celery@MSSLW40013047 v3.1.13 (Cipater)
---- **** -----
--- * ***  * -- Windows-7-6.1.7601-SP1
-- * - **** ---
- ** ---------- [config]
- ** ---------- .> app:         proj:0x3290370
- ** ---------- .> transport:   amqp://guest:**@localhost:5672//
- ** ---------- .> results:     amqp
- *** --- * --- .> concurrency: 8 (prefork)
-- ******* ----
--- ***** ----- [queues]
 -------------- .> celery           exchange=celery(direct) key=celery

[tasks]
  . celery.backend_cleanup
  . celery.chain
  . celery.chord
  . celery.chord_unlock
  . celery.chunks
  . celery.group
  . celery.map
  . celery.starmap
  . proj.tasks.add
  . proj.tasks.mul
  . proj.tasks.xsum

[2014-08-25 17:00:09,345: DEBUG/MainProcess] | Worker: Starting Pool
[2014-08-25 17:00:09,417: DEBUG/MainProcess] ^-- substep ok
[2014-08-25 17:00:09,420: DEBUG/MainProcess] | Worker: Starting Consumer
[2014-08-25 17:00:09,421: DEBUG/MainProcess] | Consumer: Starting Connection
[2014-08-25 17:00:09,457: DEBUG/MainProcess] Start from server, version: 0.9, properties: {u'information': u'Licensed under the MPL. See http://www.rabbitmq.com/', u'product': u'RabbitMQ', u'copyright': u'Copyright (C) 2007-2014 GoPivotal, Inc.', u'capabilities': {u'exchange_exchange_bindings': True, u'connection.blocked': True, u'authentication_failure_close': True, u'basic.nack': True, u'per_consumer_qos': True, u'consumer_priorities': True, u'consumer_cancel_notify': True, u'publisher_confirms': True}, u'cluster_name': u'rabbit@MSSLW40013047.ndc.nasa.gov', u'platform': u'Erlang/OTP', u'version': u'3.3.5'}, mechanisms: [u'AMQPLAIN', u'PLAIN'], locales: [u'en_US']
[2014-08-25 17:00:09,460: DEBUG/MainProcess] Open OK!
[2014-08-25 17:00:09,460: INFO/MainProcess] Connected to amqp://guest:**@127.0.0.1:5672//
[2014-08-25 17:00:09,461: DEBUG/MainProcess] ^-- substep ok
[2014-08-25 17:00:09,461: DEBUG/MainProcess] | Consumer: Starting Events
[2014-08-25 17:00:09,516: DEBUG/MainProcess] Start from server, version: 0.9, properties: {u'information': u'Licensed under the MPL. See http://www.rabbitmq.com/', u'product': u'RabbitMQ', u'copyright': u'Copyright (C) 2007-2014 GoPivotal, Inc.', u'capabilities': {u'exchange_exchange_bindings': True, u'connection.blocked': True, u'authentication_failure_close': True, u'basic.nack': True, u'per_consumer_qos': True, u'consumer_priorities': True, u'consumer_cancel_notify': True, u'publisher_confirms': True}, u'cluster_name': u'rabbit@MSSLW40013047.ndc.nasa.gov', u'platform': u'Erlang/OTP', u'version': u'3.3.5'}, mechanisms: [u'AMQPLAIN', u'PLAIN'], locales: [u'en_US']
[2014-08-25 17:00:09,519: DEBUG/MainProcess] Open OK!
[2014-08-25 17:00:09,520: DEBUG/MainProcess] using channel_id: 1
[2014-08-25 17:00:09,522: DEBUG/MainProcess] Channel open
[2014-08-25 17:00:09,523: DEBUG/MainProcess] ^-- substep ok
[2014-08-25 17:00:09,523: DEBUG/MainProcess] | Consumer: Starting Heart
[2014-08-25 17:00:09,530: DEBUG/MainProcess] ^-- substep ok
[2014-08-25 17:00:09,533: DEBUG/MainProcess] | Consumer: Starting Mingle
[2014-08-25 17:00:09,538: INFO/MainProcess] mingle: searching for neighbors
[2014-08-25 17:00:09,539: DEBUG/MainProcess] using channel_id: 1
[2014-08-25 17:00:09,540: DEBUG/MainProcess] Channel open
[2014-08-25 17:00:10,552: INFO/MainProcess] mingle: all alone
[2014-08-25 17:00:10,552: DEBUG/MainProcess] ^-- substep ok
[2014-08-25 17:00:10,552: DEBUG/MainProcess] | Consumer: Starting Gossip
[2014-08-25 17:00:10,553: DEBUG/MainProcess] using channel_id: 2
[2014-08-25 17:00:10,555: DEBUG/MainProcess] Channel open
[2014-08-25 17:00:10,559: DEBUG/MainProcess] ^-- substep ok
[2014-08-25 17:00:10,559: DEBUG/MainProcess] | Consumer: Starting Tasks
[2014-08-25 17:00:10,566: DEBUG/MainProcess] ^-- substep ok
[2014-08-25 17:00:10,566: DEBUG/MainProcess] | Consumer: Starting Control
[2014-08-25 17:00:10,568: DEBUG/MainProcess] using channel_id: 3
[2014-08-25 17:00:10,569: DEBUG/MainProcess] Channel open
[2014-08-25 17:00:10,572: DEBUG/MainProcess] ^-- substep ok
[2014-08-25 17:00:10,573: DEBUG/MainProcess] | Consumer: Starting event loop
[2014-08-25 17:00:10,575: WARNING/MainProcess] celery@MSSLW40013047 ready.
[2014-08-25 17:00:10,575: DEBUG/MainProcess] basic.qos: prefetch_count->32

-A tells Celery where to find my Celery app instance. Using just proj also works, but since that only makes it search for proj.celery anyway, being explicit here seems fine. worker is the command given to Celery, telling it to spawn some workers to execute the tasks loaded from proj.celery. Finally, -l debug tells Celery to set the log level to debug so that I get lots of information. This would normally be -l info.

To test my task server, I open an IPython Qt console and navigate to the celery_demo folder (which contains proj). Then I run from proj.tasks import add. Simply calling add(1, 2) returns 3 without using the server, as expected. Here is what happens when I call add.delay:

 add.delay(2, 3) 

which returns:

 <AsyncResult: 42123ff3-e94e-4673-808a-ec6c847679d8> 
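For what it's worth, the id inside the AsyncResult repr above is just a version-4 UUID that Celery generates per task, which is easy to confirm locally:

```python
import uuid

# The task id from the AsyncResult above is a standard UUID4 string.
task_id = '42123ff3-e94e-4673-808a-ec6c847679d8'
print(uuid.UUID(task_id).version)  # 4
```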

and in my cmd.exe window I get:

[2014-08-25 17:20:38,109: INFO/MainProcess] Received task: proj.tasks.add[42123ff3-e94e-4673-808a-ec6c847679d8]
[2014-08-25 17:20:38,109: DEBUG/MainProcess] TaskPool: Apply <function _fast_trace_task at 0x033CD6F0> (args:(u'proj.tasks.add', u'42123ff3-e94e-4673-808a-ec6c847679d8', [2, 3], {}, {u'timelimit': [None, None], u'utc': True, u'is_eager': False, u'chord': None, u'group': None, u'args': [2, 3], u'retries': 0, u'delivery_info': {u'priority': None, u'redelivered': False, u'routing_key': u'celery', u'exchange': u'celery'}, u'expires': None, u'hostname': 'celery@MSSLW40013047', u'task': u'proj.tasks.add', u'callbacks': None, u'correlation_id': u'42123ff3-e94e-4673-808a-ec6c847679d8', u'errbacks': None, u'reply_to': u'70ed001d-193c-319c-9447-8d77c231dc10', u'taskset': None, u'kwargs': {}, u'eta': None, u'id': u'42123ff3-e94e-4673-808a-ec6c847679d8', u'headers': {}}) kwargs:{})
[2014-08-25 17:20:38,124: DEBUG/MainProcess] Task accepted: proj.tasks.add[42123ff3-e94e-4673-808a-ec6c847679d8] pid:4052
[2014-08-25 17:20:38,125: INFO/MainProcess] Task proj.tasks.add[42123ff3-e94e-4673-808a-ec6c847679d8] succeeded in 0.0130000114441s: 5

As the last line shows, the result of 5 is being computed. Next, I want to store the AsyncResult object, check its state, and get the result value:

 result = add.delay(3, 4) 

However, result.state and result.get(timeout=1) don't work as expected:

In: result.state
Out: 'PENDING'

In: result.status
Out: 'PENDING'

In: result.get(timeout=1)
---------------------------------------------------------------------------
TimeoutError                              Traceback (most recent call last)
<ipython-input-17-375f2d3530cb> in <module>()
----> 1 result.get(timeout=1)

C:\Anaconda32\lib\site-packages\celery\result.pyc in get(self, timeout, propagate, interval, no_ack, follow_parents)
    167                     interval=interval,
    168                     on_interval=on_interval,
--> 169                     no_ack=no_ack,
    170                 )
    171             finally:

C:\Anaconda32\lib\site-packages\celery\backends\amqp.pyc in wait_for(self, task_id, timeout, cache, propagate, no_ack, on_interval, READY_STATES, PROPAGATE_STATES, **kwargs)
    155                                      on_interval=on_interval)
    156         except socket.timeout:
--> 157             raise TimeoutError('The operation timed out.')
    158
    159         if meta['status'] in PROPAGATE_STATES and propagate:

TimeoutError: The operation timed out.

If result.state or result.status had the expected value of 'SUCCESS', then result.get(timeout=1) should return 5.

It seems that result storage or message passing isn't working properly. The tutorial simply says to set the backend keyword argument when calling Celery(), or the CELERY_RESULT_BACKEND configuration setting. The "First Steps" guide has backend='amqp', while the "Next Steps" guide has backend='amqp://', which is also what the example on GitHub uses.
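For reference, the same settings can also live in a standalone configuration module instead of the backend keyword. A minimal sketch, assuming a hypothetical celeryconfig.py next to the app (these are the Celery 3.1 uppercase setting names):

```python
# celeryconfig.py -- hypothetical standalone config module (Celery 3.1 names)
BROKER_URL = 'amqp://'
CELERY_RESULT_BACKEND = 'amqp://'
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_ACCEPT_CONTENT = ['json']
```

It would be loaded with app.config_from_object('celeryconfig') instead of passing broker= and backend= to Celery().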

I've been banging my head against this for a while now and still have no clue how to proceed. Any ideas on what to try next? Thanks!

Solutions collected from the web for "Celery 'Getting Started' can't retrieve results; always pending"

  • Windows 8 x64
  • Python 2.7.3 (ActivePython)
  • Erlang 17.1 x64
  • RabbitMQ Server 3.3.5
  • Celery 3.1.13

It randomly stopped working for me. Exact same problem: pending forever. Reinstalling Erlang or RabbitMQ didn't help.

I also tested on Debian Linux 7 x86, where it works fine with no problems.

See also: https://github.com/celery/celery/issues/2146

This is a Windows-related issue; setting the worker flag --pool=solo fixed it for me for the moment.
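Concretely, assuming the same proj layout as in the question, the suggested --pool=solo flag would be passed on the worker command line like this:

```shell
celery -A proj.celery worker -l info --pool=solo
```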

The .delay method puts your task into the queue. result.state showing PENDING means your task has not been executed yet. There may be a backlog of queued tasks, which would delay it.

Check whether there are any scheduled or active tasks:

>>> from celery.task.control import inspect
>>> i = inspect()
>>> i.scheduled()
>>> i.active()

You have to add track_started=True. Admittedly, it's a hard option to know about.

The default is False, so while a task has not finished you will always get PENDING. Alternatively, you may have a configuration error in your backend or broker settings. Please check them again.

@app.task(bind=True, track_started=True)
def add(self, x, y):  # bind=True passes the task instance as the first argument
    return x + y