python - Celery workers not consuming enough tasks


I have a strange issue with Celery.

I am using RabbitMQ as the message broker and as the result backend.

Tasks are serialized via pickle. A task gets the id of a file in the database, fetches the file, works on it, and writes the result back to the database. I'm only storing the id in the result backend.
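A minimal sketch of what such a task looks like (the module name proj, the broker/backend URLs, and the task name work_on_file are just placeholders, not my actual code):

    # tasks.py - minimal sketch of the setup described above
    from celery import Celery

    # broker and result backend both point at the local RabbitMQ instance
    app = Celery('proj', broker='amqp://guest@localhost//', backend='rpc://')

    @app.task
    def work_on_file(file_id):
        # fetch the file referenced by file_id from the database,
        # do the CPU-heavy work, and write the result back
        ...  # database fetch / processing / write go here
        # only the id is returned, so only the id ends up in the result backend
        return file_id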

I use a group to supply the tasks, and I don't run subtasks within it.
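Dispatching looks roughly like this (work_on_file is the sketched task from above, the ids are placeholders):

    from celery import group
    from tasks import work_on_file  # the sketched task above

    file_ids = [1, 2, 3, 4]  # placeholder ids
    # one signature per file id; the group just runs them in parallel,
    # no subtasks are launched from inside the tasks
    job = group(work_on_file.s(fid) for fid in file_ids)
    group_result = job.apply_async()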

I have one worker with concurrency=8 (prefork).

When I start the tasks, all 8 processes are working (100% CPU usage).

After the first tasks finish, the strange behavior begins: the processes do not begin new tasks. A task gets initialized (I used CELERYD_MAX_TASKS_PER_CHILD=1), but its run method doesn't get called.

So the problem is that not all processes are working all the time.

I tried many configuration settings, but nothing changed this behavior.
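For reference, the settings in play were roughly these (old-style names, whether set in the config module or via the equivalent command-line flags):

    # celeryconfig.py
    CELERYD_CONCURRENCY = 8          # 8 prefork child processes
    CELERYD_MAX_TASKS_PER_CHILD = 1  # recycle each child after a single task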

Do you have any idea?

It's not the database or anything like that; the message broker and the database are running locally. I watched the workers with Flower, and it says that most of the time only around 4 processes are active. The other tasks are reserved but don't start.
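The same active/reserved split that Flower shows can also be checked from the command line (proj is a placeholder for the app module):

    celery -A proj inspect active    # tasks currently executing
    celery -A proj inspect reserved  # tasks prefetched by a worker but not started yet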

Hope you can help me!

I finally figured it out:

It's an option I had to pass when starting the worker.

Starting the worker with the -Ofair option did it!
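In case it helps anyone else, the worker invocation then looks roughly like this (proj and the concurrency value are placeholders for my setup):

    celery worker -A proj --concurrency=8 -Ofair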

See: http://docs.celeryproject.org/en/latest/userguide/optimizing.html#prefork-pool-prefetch-settings

Thanks :)

