Start the worker from PyCharm's Terminal:
D:\pythonFiles\test_celery> celery -A celery_app worker --loglevel=info
It fails with an error:
 -------------- celery@DESKTOP-0R048UM v4.4.2 (cliffs)
--- ***** -----
-- ******* ---- Windows-10-10.0.18362-SP0 2020-04-07 14:26:56
- *** --- * ---
- ** ---------- [config]
- ** ---------- .> app:         mails:0x18c481036a0
- ** ---------- .> transport:   redis://127.0.0.1:6379//
- ** ---------- .> results:     redis://127.0.0.1:6379/0
- *** --- * --- .> concurrency: 8 (prefork)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
 -------------- [queues]
                .> celery           exchange=celery(direct) key=celery

[tasks]
  . celery_app.mail_test.send_mail

[2020-04-07 14:26:56,261: INFO/MainProcess] Connected to redis://127.0.0.1:6379//
[2020-04-07 14:26:56,276: INFO/MainProcess] mingle: searching for neighbors
[2020-04-07 14:26:56,915: INFO/SpawnPoolWorker-1] child process 8840 calling self.run()
[2020-04-07 14:26:56,920: INFO/SpawnPoolWorker-2] child process 17180 calling self.run()
[2020-04-07 14:26:56,931: INFO/SpawnPoolWorker-3] child process 5680 calling self.run()
[2020-04-07 14:26:56,941: INFO/SpawnPoolWorker-5] child process 18680 calling self.run()
[2020-04-07 14:26:56,944: INFO/SpawnPoolWorker-7] child process 21336 calling self.run()
[2020-04-07 14:26:56,949: INFO/SpawnPoolWorker-6] child process 3484 calling self.run()
[2020-04-07 14:26:56,960: INFO/SpawnPoolWorker-4] child process 13028 calling self.run()
[2020-04-07 14:26:56,972: INFO/SpawnPoolWorker-8] child process 3128 calling self.run()
[2020-04-07 14:26:57,313: INFO/MainProcess] mingle: all alone
[2020-04-07 14:26:57,362: INFO/MainProcess] celery@DESKTOP-0R048UM ready.
[2020-04-07 14:27:03,003: INFO/MainProcess] Received task: celery_app.mail_test.send_mail[3f2aff1a-0555-497e-aaef-a28c184ec1fe]
[2020-04-07 14:27:03,007: ERROR/MainProcess] Task handler raised error: ValueError('not enough values to unpack (expected 3, got 0)',)
Traceback (most recent call last):
  File "d:\python3.6.4\lib\site-packages\billiard\pool.py", line 362, in workloop
    result = (True, prepare_result(fun(*args, **kwargs)))
  File "d:\python3.6.4\lib\site-packages\celery\app\trace.py", line 546, in _fast_trace_task
    tasks, accept, hostname = _loc
ValueError: not enough values to unpack (expected 3, got 0)
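For context on the traceback: `_fast_trace_task` unpacks three values (`tasks, accept, hostname`) from per-process state that the prefork pool normally initializes when a worker child is forked. Windows has no fork, so billiard spawns the children instead (note the `SpawnPoolWorker-N` names in the log above), and that state can still be empty when the first task arrives. The exception itself is just Python's ordinary unpacking failure, easy to reproduce outside Celery:

```python
# Minimal reproduction of the ValueError seen in the traceback:
# unpacking an empty tuple into three names, like
# "tasks, accept, hostname = _loc" when _loc was never populated.
_loc = ()  # stands in for the uninitialized worker state
try:
    tasks, accept, hostname = _loc
except ValueError as exc:
    print(exc)  # not enough values to unpack (expected 3, got 0)
```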
After some searching I found https://www.cnblogs.com/Hannibal-2018/p/11147224.html and followed that author's fix: install eventlet.
pip install eventlet
Running the worker again, the task executes without errors (note the added -P eventlet option):
D:\pythonFiles\test_celery> celery -A celery_app worker --loglevel=info -P eventlet
 -------------- celery@DESKTOP-0R048UM v4.4.2 (cliffs)
--- ***** -----
-- ******* ---- Windows-10-10.0.18362-SP0 2020-04-07 14:28:41
- *** --- * ---
- ** ---------- [config]
- ** ---------- .> app:         mails:0x1e9cb8556a0
- ** ---------- .> transport:   redis://127.0.0.1:6379//
- ** ---------- .> results:     redis://127.0.0.1:6379/0
- *** --- * --- .> concurrency: 8 (eventlet)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
 -------------- [queues]
                .> celery           exchange=celery(direct) key=celery

[tasks]
  . celery_app.mail_test.send_mail

[2020-04-07 14:28:41,149: INFO/MainProcess] Connected to redis://127.0.0.1:6379//
[2020-04-07 14:28:41,167: INFO/MainProcess] mingle: searching for neighbors
[2020-04-07 14:28:42,189: INFO/MainProcess] mingle: all alone
[2020-04-07 14:28:42,204: INFO/MainProcess] pidbox: Connected to redis://127.0.0.1:6379//.
[2020-04-07 14:28:42,207: INFO/MainProcess] celery@DESKTOP-0R048UM ready.
[2020-04-07 14:28:47,919: INFO/MainProcess] Received task: celery_app.mail_test.send_mail[9c1fcca3-04ec-4d34-b9e6-4c3026283d79]
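Besides switching to the eventlet pool, another workaround frequently reported for this exact Windows error is setting the FORKED_BY_MULTIPROCESSING environment variable before Celery/billiard is imported, which makes billiard run its fork-emulation initialization in the spawned children. Whether it suffices depends on your celery/billiard versions, so treat this as a sketch rather than a guaranteed fix:

```python
import os

# Commonly reported workaround for the prefork pool on Windows:
# billiard checks this variable in spawned child processes. It must be
# set before celery/billiard is imported, e.g. at the very top of the
# module that creates the Celery app (here, the celery_app package).
os.environ.setdefault("FORKED_BY_MULTIPROCESSING", "1")
```

Alternatively, starting the worker with `-P solo` runs tasks in the worker's main process and sidesteps the spawn issue entirely, at the cost of concurrency. The eventlet approach above is what worked in this case.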
Reference: https://www.cnblogs.com/Hannibal-2018/p/11147224.html
Solution to Celery ValueError: not enough values to unpack (expected 3, got 0)
Original post: https://www.cnblogs.com/aidenzdly/p/12653302.html