
Jacob Perkins: Async Python Functions with Celery

Celery is a great tool for scheduled function execution in Python. You can also use it to run functions in the background, asynchronously from your main process. However, it does not support Python asyncio. This is a big limitation, because async functions are usually much more I/O efficient, and there are many libraries that provide great async support. And parallel data processing with asyncio.gather becomes impossible in Celery without async support.
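To see why asyncio.gather matters, here is a minimal stdlib-only sketch (the fetch function and its sleep-based I/O are illustrative stand-ins for real network calls): three 0.1-second "requests" complete concurrently in roughly 0.1 seconds total, instead of 0.3 seconds sequentially.

```python
import asyncio
import time

# Hypothetical I/O-bound task: asyncio.sleep stands in for a network call.
async def fetch(item: str) -> str:
    await asyncio.sleep(0.1)  # simulated I/O wait
    return f"result-{item}"

async def main() -> list[str]:
    # Run all "requests" concurrently; results come back in argument order.
    return await asyncio.gather(*(fetch(i) for i in ["a", "b", "c"]))

start = time.perf_counter()
results = asyncio.run(main())
elapsed = time.perf_counter() - start
```

This is exactly the pattern you cannot use inside a stock Celery task, since the worker has no event loop to await the coroutines on.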

Celery Async Issues

Unfortunately, based on the current Open status of these issues, Celery will not support async functions anytime soon.

But luckily, there are two projects that provide async Celery support.

AIO Celery

This project is an alternative independent asyncio implementation of Celery

aio-celery “does not depend on the celery codebase”. Instead, it provides a new implementation of the Celery Message Protocol that enables asyncio tasks and workers.

It is written completely from scratch as a thin wrapper around aio-pika (which is an asynchronous RabbitMQ python driver) and it has no other dependencies

It is actively developed, and seems like a great Celery alternative. But there are some downsides:

  1. “Only RabbitMQ as a message broker” means you cannot use any other broker, such as Redis
  2. “Only Redis as a result backend” means you can’t store results in any other database
  3. “Complete feature parity with upstream Celery project is not the goal”, so there may be features from celery you want that are not present in aio-celery
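Task definitions in aio-celery are coroutines decorated much like upstream Celery tasks. A rough sketch, assuming aio-celery exposes a Celery app class and a task decorator mirroring the upstream naming (the task name and the add function here are illustrative, not from the project's docs):

```python
import asyncio

from aio_celery import Celery  # assumed import path

app = Celery()

# Assumed decorator signature; the name argument is illustrative.
@app.task(name="add-two-numbers")
async def add(a: int, b: int) -> int:
    await asyncio.sleep(0.1)  # stand-in for async I/O work
    return a + b
```

Check the aio-celery README for the actual API before relying on this shape.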

Celery AIO Pool

celery-aio-pool provides a custom worker pool implementation that works with Celery 5.3+. Unlike aio-celery, you can keep using your existing Celery implementation. All you have to do to get async task support in Celery is:

  1. Start your celery worker with this environment variable: CELERY_CUSTOM_WORKER_POOL='celery_aio_pool.pool:AsyncIOPool'
  2. Run the celery worker process with --pool=custom

So your worker command will look like:

CELERY_CUSTOM_WORKER_POOL='celery_aio_pool.pool:AsyncIOPool' celery worker --pool=custom

plus whatever other arguments or environment variables you need. Once you have this in place, you can start using async functions as Celery tasks.
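With the pool in place, an async task is just a regular Celery task whose body is a coroutine. A minimal sketch, assuming celery and celery-aio-pool are installed (the app name, Redis broker URL, and fetch_status task are hypothetical, for illustration only):

```python
import asyncio

from celery import Celery

# Hypothetical app; the broker URL is an assumption for illustration.
app = Celery("tasks", broker="redis://localhost:6379/0")

@app.task
async def fetch_status(url: str) -> str:
    # With celery_aio_pool's AsyncIOPool as the worker pool, the worker
    # awaits this coroutine on its event loop instead of returning an
    # un-awaited coroutine object, as a stock Celery worker would.
    await asyncio.sleep(0.1)  # stand-in for an async HTTP request
    return f"checked {url}"

# Queued as usual from the caller's side:
# fetch_status.delay("https://example.com")
```

Callers don't change at all: you still use .delay() or .apply_async(); only the worker-side execution model is different.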

While celery-aio-pool is not as actively developed, it works, and has the following benefits:

  • Simple to install and configure with Celery >= 5.3
  • Works with any Celery-supported message broker or result backend
  • Works with your existing celery setup without requiring any other changes

https://streamhacker.com/2025/09/22/async-python-functions-with-celery/