Sanic + Celery + aiosmtplib

Hello, guys. I’ve been using Sanic for a while now, along with aiosmtplib for sending mail over SMTP, but I’m running into a bug with it. When a task in the queue fails, none of the upcoming tasks get sent and I have to restart it.

I wanted to try Celery as a worker for RabbitMQ and work out a new way to handle my mail queue, but I think I’m missing some fundamental principles of queue handling. Do I have to initialize my Celery worker along with the Sanic app, or does it have to be a remote server just for that? How can I queue new tasks (they will only be emails) on a remote Celery worker?

Thanks!

Hi @rgarcia. Happy to help here, as I have used these two solutions together in production.

You would have separate machines running these services. Each is a process that would be connected to a common broker (rabbit, redis, etc). You would need one Celery instance that has all of the task code.
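To make that concrete, here is a rough sketch of what the worker side could look like with aiosmtplib, assuming RabbitMQ as the broker. The module name tasks.py, the task name send_email, and the broker/SMTP settings are just placeholders, not anything specific to your setup:

```python
# tasks.py -- lives on the Celery worker machine (all names and settings
# here are placeholders)
import asyncio
from email.message import EmailMessage

import aiosmtplib
from celery import Celery

# Point the broker URL at your own RabbitMQ (or Redis) instance.
app = Celery("tasks", broker="amqp://guest:guest@localhost:5672//")


@app.task(name="send_email")
def send_email(to_addr: str, subject: str, body: str) -> None:
    # Celery tasks are plain synchronous callables, so the aiosmtplib
    # coroutine is driven with asyncio.run() inside the task.
    message = EmailMessage()
    message["From"] = "noreply@example.com"
    message["To"] = to_addr
    message["Subject"] = subject
    message.set_content(body)

    asyncio.run(aiosmtplib.send(message, hostname="localhost", port=25))
```

You would then start it on the worker machine with something like `celery -A tasks worker`.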

There are a couple of options for Sanic, but I would suggest using Celery.send_task to send messages. That way you do not need to have your task code sitting on two machines and keep the copies in sync. send_task is a method that just dispatches the message for Celery to pick up and execute.
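On the Sanic side, a minimal sketch might look like this, assuming a hypothetical /signup endpoint and the send_email task name from the sketch above; the broker URL is again illustrative:

```python
# Sanic side -- only a Celery "client" pointed at the same broker,
# no task code imported here.
from celery import Celery
from sanic import Sanic, response

app = Sanic("mail-api")
celery_client = Celery(broker="amqp://guest:guest@localhost:5672//")


@app.post("/signup")
async def signup(request):
    payload = request.json
    # send_task only publishes a message under the registered task name;
    # the remote worker that knows "send_email" will pick it up and run it.
    celery_client.send_task(
        "send_email",
        args=[payload["email"], "Welcome!", "Thanks for signing up."],
    )
    return response.json({"queued": True})
```

The nice part is that the Sanic process never needs the task implementation; it only needs the task’s registered name and the broker URL.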

I hope this helps to get you started. Happy to help further if needed.


Thanks!!

This worked perfectly. The Celery configuration lives inside my project rather than in a separate folder. I didn’t even have to use the send_task method; I used .apply instead and it worked too.
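In case it helps anyone else, here is roughly what that direct call looks like when the task is importable from the project (names are taken from the hypothetical tasks.send_email sketch above; note that .apply() runs the task in the calling process, while .delay()/.apply_async() would hand it to a worker):

```python
from tasks import send_email

# Runs the task synchronously in this process; swap for .delay()/.apply_async()
# to publish it to the broker for a remote worker instead.
send_email.apply(args=["user@example.com", "Welcome!", "Hi there."])
```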