How to scale Celery to execute many tasks in parallel?

Published 14 Apr, 2022

Reading time: 2 Mins

A Celery worker cannot execute many tasks at the same time by default. But with one simple configuration option you can run many tasks in parallel.

I had never been in a situation where I needed to deal with the scalability of my projects. That changed when my project hit the Hacker News front page.

There was a huge spike in server load, and people in the HN thread were complaining that the app was not working. I sat down at my laptop and tried to figure out what was going on. A few minutes later I found that Celery was not working as expected.

When I tested the application on my laptop, I used it as a single person. I never imagined many people searching for a domain name at the same time. By default, Celery is set up with a plain worker command, which is fine for local development but not for production. Let's assume you start Celery like this:

celery -A getsocialdomain worker

Now, whenever a user searches in my project, the worker picks up and executes the task. But if more people search at the same time, their tasks pile up in the queue. In the end, each user's query is executed one by one by Celery.

But how do we run as many workers as possible? That's where the --autoscale option comes into play. You specify the maximum number of worker processes to spawn when more tasks are in the queue, and Celery automatically handles starting workers and shutting them down when they are no longer needed.

celery -A getsocialdomain worker --autoscale=10,0

The flag takes the form --autoscale=max,min, so the above configuration scales the worker pool up to 10 processes at peak load and down to 0 when there are no users.

Also keep in mind that each worker process consumes memory, so keep an eye on memory usage as the pool scales up.
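One way to keep memory in check is Celery's --max-tasks-per-child option, which recycles each worker process after it has executed a given number of tasks. A sketch, with the task count as an illustrative value:

```shell
# Scale up to 10 processes at peak, and replace each process
# after 100 tasks so leaked memory is reclaimed.
celery -A getsocialdomain worker --autoscale=10,0 --max-tasks-per-child=100
```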

This article is published under the development and django tags. If you wish to receive an email when I post a new blog post, please subscribe to my newsletter.
