In this tutorial we will set up Celery in our Django project in a few steps and run it as a daemon in the background, so that we can send it tasks to execute asynchronously.
Installing Celery
pip install celery
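Since Redis will be the broker, you will also need the Python Redis client; the easiest way is to install Celery together with its redis extra (skip this if redis-py is already installed):
pip install "celery[redis]"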
Add some Celery configuration to your project
In this example we will be using Redis as the broker and result backend, so make sure you install Redis via apt if you are using Ubuntu.
sudo apt-get install redis-server
repo/projectile/projectile/settings.py
# CELERY SETTINGS
CELERY_BACKEND = 'redis://localhost:6379/3'
CELERY_BROKER_URL = 'redis://localhost:6379/4'
CELERY_RESULT_BACKEND = 'redis://localhost:6379/5'
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_ACCEPT_CONTENT = ['json']
CELERY_ENABLE_UTC = True
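Before wiring up Celery it can be useful to confirm that Redis is actually reachable with the settings above. Here is a minimal sketch using the redis-py client (pulled in by celery[redis]); the host, port and database number mirror CELERY_BROKER_URL:
import redis

# Ping the broker database; raises a ConnectionError if Redis is not reachable.
r = redis.Redis(host='localhost', port=6379, db=4)
print(r.ping())  # True means Redis is up and accepting connections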
Add a simple task to your project so that we can test that Celery is working as expected.
repo/projectile/core/tasks.py
from __future__ import absolute_import

import logging

from celery import shared_task

logger = logging.getLogger(__name__)


@shared_task
def add(x, y):
    # Only for testing...
    return x + y
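With a worker running (see the development command in the daemonize section below), you can exercise the task from the Django shell (python manage.py shell). A minimal sketch, assuming the core app is importable from the project root:
from core.tasks import add

result = add.delay(4, 4)           # queue the task on the broker, returns an AsyncResult
print(result.get(timeout=10))      # block until the worker replies with 8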
Add the celery.py file and edit __init__.py in order to properly wire up Celery.
repo/projectile/projectile/celery.py
from __future__ import absolute_import, unicode_literals
import os
import dotenv
from celery import Celery
# Load .env variables
dotenv.read_dotenv()
# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'projectile.settings')
app = Celery('projectile')
# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
# should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')
# Load task modules from all registered Django app configs.
app.autodiscover_tasks()
@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
repo/projectile/projectile/__init__.py
from __future__ import absolute_import, unicode_literals
# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app
__all__ = ('celery_app',)
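Note that read_dotenv() in celery.py is assumed to come from the django-dotenv package; if you do not keep settings in a .env file you can drop the dotenv import and that call. To confirm that the CELERY_-prefixed settings were picked up, you can inspect the app from a Django shell; a minimal sketch using the celery_app exported above:
from projectile import celery_app

# These should mirror CELERY_BROKER_URL and CELERY_RESULT_BACKEND from settings.py,
# which shows that config_from_object() and the CELERY_ namespace are working.
print(celery_app.conf.broker_url)      # redis://localhost:6379/4
print(celery_app.conf.result_backend)  # redis://localhost:6379/5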
Create a new user named celery
sudo useradd celery -d /home/celery -s /bin/bash
Create the necessary pid and log folders and set the right permissions
Use systemd-tmpfiles in order to create the working directories (for logs and the pid file). Put the two lines below in /etc/tmpfiles.d/celery.conf
d /var/run/celery 0755 celery celery -
d /var/log/celery 0755 celery celery -
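These directories are normally created at boot; on most systems you can also apply the configuration immediately with:
sudo systemd-tmpfiles --create /etc/tmpfiles.d/celery.conf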
❕ NOTE: The step above is really important. If the celery user does not have the right access permissions, the worker will fail to start.
You can also create the directories manually, but I prefer the tmpfiles approach above.
sudo mkdir /var/log/celery
sudo chown -R celery:celery /var/log/celery
sudo chmod -R 755 /var/log/celery
sudo mkdir /var/run/celery
sudo chown -R celery:celery /var/run/celery
sudo chmod -R 755 /var/run/celery
Daemonize Celery
You do not need to do this in your development environment. There you can just run celery -A projectile worker --loglevel=DEBUG
after running cd repo/projectile/
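You can then check that the development worker is up and responding (each running worker should answer with pong):
celery -A projectile inspect ping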
Create the /etc/conf.d/celery file
# Names of nodes to start
# here we have a single node
CELERYD_NODES="w1"
# or we could have three nodes:
#CELERYD_NODES="w1 w2 w3"
# Absolute or relative path to the 'celery' command:
CELERY_BIN="/home/django/env/bin/celery"
# App instance to use
# comment out this line if you don't use an app
CELERY_APP="projectile"
# How to call 'celery multi'
CELERYD_MULTI="multi"
# Extra command-line arguments to the worker
CELERYD_OPTS="--time-limit=300 --concurrency=8"
# - %n will be replaced with the first part of the nodename.
# - %I will be replaced with the current child process index
# and is important when using the prefork pool to avoid race conditions.
CELERYD_PID_FILE="/var/run/celery/%n.pid"
CELERYD_LOG_FILE="/var/log/celery/%n%I.log"
CELERYD_LOG_LEVEL="INFO"
Create /etc/systemd/system/celery.service
[Unit]
Description=Celery Service
After=network.target
[Service]
Type=forking
User=celery
Group=celery
EnvironmentFile=/etc/conf.d/celery
WorkingDirectory=/home/django/project/projectile
ExecStart=/bin/sh -c '${CELERY_BIN} multi start ${CELERYD_NODES} \
-A ${CELERY_APP} --pidfile=${CELERYD_PID_FILE} \
--logfile=${CELERYD_LOG_FILE} --loglevel=${CELERYD_LOG_LEVEL} ${CELERYD_OPTS}'
ExecStop=/bin/sh -c '${CELERY_BIN} multi stopwait ${CELERYD_NODES} \
--pidfile=${CELERYD_PID_FILE}'
ExecReload=/bin/sh -c '${CELERY_BIN} multi restart ${CELERYD_NODES} \
-A ${CELERY_APP} --pidfile=${CELERYD_PID_FILE} \
--logfile=${CELERYD_LOG_FILE} --loglevel=${CELERYD_LOG_LEVEL} ${CELERYD_OPTS}'
[Install]
WantedBy=multi-user.target
Make sure you run the command below after creating /etc/conf.d/celery and /etc/systemd/system/celery.service
sudo systemctl daemon-reload
❕ NOTE: Whenever you change /etc/conf.d/celery or /etc/systemd/system/celery.service, you need to run sudo systemctl daemon-reload again.
Fire up Celery via systemd
sudo systemctl start celery
and then run
sudo systemctl status celery
You should get output similar to the one below
faisal@example-com:~$ sudo systemctl status celery
● celery.service - Celery Service
Loaded: loaded (/etc/systemd/system/celery.service; disabled; vendor preset: enabled)
Active: active (running) since Mon 2019-02-18 18:42:43 CET; 2 days ago
Main PID: 4852 (python)
CGroup: /system.slice/celery.service
├─4852 /home/django/env/bin/python -m celery worker --time-limit=300 -A projectile --concurrency=8 --loglevel=DEBUG --logfile=/var/log/celery/w1%I.log --pidfile=/var/run/celery/w1.pid --hostname=w1@example-com
├─4856 /home/django/env/bin/python -m celery worker --time-limit=300 -A projectile --concurrency=8 --loglevel=DEBUG --logfile=/var/log/celery/w1%I.log --pidfile=/var/run/celery/w1.pid --hostname=w1@example-com
├─4857 /home/django/env/bin/python -m celery worker --time-limit=300 -A projectile --concurrency=8 --loglevel=DEBUG --logfile=/var/log/celery/w1%I.log --pidfile=/var/run/celery/w1.pid --hostname=w1@example-com
├─4858 /home/django/env/bin/python -m celery worker --time-limit=300 -A projectile --concurrency=8 --loglevel=DEBUG --logfile=/var/log/celery/w1%I.log --pidfile=/var/run/celery/w1.pid --hostname=w1@example-com
├─4859 /home/django/env/bin/python -m celery worker --time-limit=300 -A projectile --concurrency=8 --loglevel=DEBUG --logfile=/var/log/celery/w1%I.log --pidfile=/var/run/celery/w1.pid --hostname=w1@example-com
├─4860 /home/django/env/bin/python -m celery worker --time-limit=300 -A projectile --concurrency=8 --loglevel=DEBUG --logfile=/var/log/celery/w1%I.log --pidfile=/var/run/celery/w1.pid --hostname=w1@example-com
├─4861 /home/django/env/bin/python -m celery worker --time-limit=300 -A projectile --concurrency=8 --loglevel=DEBUG --logfile=/var/log/celery/w1%I.log --pidfile=/var/run/celery/w1.pid --hostname=w1@example-com
├─4862 /home/django/env/bin/python -m celery worker --time-limit=300 -A projectile --concurrency=8 --loglevel=DEBUG --logfile=/var/log/celery/w1%I.log --pidfile=/var/run/celery/w1.pid --hostname=w1@example-com
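To have the worker come back automatically after a reboot, you can also enable the unit, and journalctl lets you inspect the daemon's output (the workers' own logs end up in /var/log/celery as configured above):
sudo systemctl enable celery
sudo journalctl -u celery -f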
That's it. Happy coding!