Using Celery With Flask

The topic of running background tasks is complex, and because of that there is a lot of confusion around it. I have tackled it in my Mega-Tutorial, later in my book, and then again in much more detail in my REST API training video. To keep things simple, in all the examples I have used so far I have executed background tasks in threads, but I always noted that for a more scalable and production-ready solution a task queue such as Celery should be used instead.

My readers constantly ask me about Celery, and how a Flask application can use it, so today I am going to show you two examples that I hope will cover most application needs.

What is Celery?

Celery is an asynchronous task queue. You can use it to execute tasks outside of the context of your application. The general idea is that any resource-intensive tasks that your application may need to run can be offloaded to the task queue, leaving your application free to respond to client requests.

Running background tasks through Celery is not as trivial as doing so in threads. But the benefits are many, as Celery has a distributed architecture that will enable your application to scale. A Celery installation has three core components:

  1. The Celery client. This is used to issue background jobs. When working with Flask, the client runs with the Flask application.
  2. The Celery workers. These are the processes that run the background jobs. Celery supports local and remote workers, so you can start with a single worker running on the same machine as the Flask server, and later add more workers as the needs of your application grow.
  3. The message broker. The client communicates with the workers through a message queue, and Celery supports several ways to implement these queues. The most commonly used brokers are RabbitMQ and Redis.

For The Impatient

If you are the instant gratification type, and the screenshot at the top of this article intrigued you, then head over to the Github repository for the code used in this article. The README file there will give you the quick and dirty approach to running and playing with the example application.

Then come back to learn how everything works!

Working with Flask and Celery

The integration of Celery with Flask is so simple that no extension is required. A Flask application that uses Celery needs to initialize the Celery client as follows:

from flask import Flask
from celery import Celery

app = Flask(__name__)
app.config['CELERY_BROKER_URL'] = 'redis://localhost:6379/0'
app.config['CELERY_RESULT_BACKEND'] = 'redis://localhost:6379/0'

celery = Celery(app.name, broker=app.config['CELERY_BROKER_URL'])
celery.conf.update(app.config)

As you can see, Celery is initialized by creating an object of class Celery, and passing it the application name and the connection URL for the message broker, which I put in app.config under key CELERY_BROKER_URL. This URL tells Celery where the broker service is running. If you use a broker other than Redis, or have the broker running on a different machine, then you will need to change the URL accordingly.

Any additional configuration options for Celery can be passed directly from Flask's configuration through the celery.conf.update() call. The CELERY_RESULT_BACKEND option is only necessary if you need to have Celery store status and results from tasks. The first example I will show you does not require this functionality, but the second does, so it's best to have it configured from the start.

Any functions that you want to run as background tasks need to be decorated with the celery.task decorator. For example:

@celery.task
def my_background_task(arg1, arg2):
    # some long running task here
    return result

Then the Flask application can request the execution of this background task as follows:

task = my_background_task.delay(10, 20)

The delay() method is a shortcut to the more powerful apply_async() call. Here is the equivalent call using apply_async():

task = my_background_task.apply_async(args=[10, 20])

When using apply_async(), you can give Celery more detailed instructions about how the background task is to be executed. A useful option is to request that the task executes at some point in the future. For example, this invocation will schedule the task to run in about a minute:

task = my_background_task.apply_async(args=[10, 20], countdown=60)

The return value of delay() and apply_async() is an object that represents the task, and this object can be used to obtain status. I will show you how this is done later in this article, but for now let's keep it simple and not worry about results from tasks.

Consult the Celery documentation to learn about many other available options.

Simple Example: Sending Asynchronous Emails

The first example that I'm going to show is a very common need of applications: the ability to send emails without blocking the main application.

For this example I'm going to use the Flask-Mail extension, which I covered in very good detail in other articles. I'm going to assume that you are familiar with this extension, so if you need a refresher see this tutorial or my Flask book.

The example application that I'm going to use to illustrate the topic presents a simple web form with one text field. The user is asked to enter an email address in this field, and upon submission, the server sends a test email to this address. The form includes two submit buttons, one to send the email immediately, and another to send it after a wait of one minute. The top portion of the screenshot at the top of this article shows how this form looks.

Here is the HTML template that supports this example:

    <title>Flask + Celery Examples</title>
    <h1>Flask + Celery Examples</h1>
    <h2>Example 1: Send Asynchronous Email</h2>
    {% for message in get_flashed_messages() %}
    <p style="color: red;">{{ message }}</p>
    {% endfor %}
    <form method="POST">
      <p>Send test email to: <input type="text" name="email" value="{{ email }}"></p>
      <input type="submit" name="submit" value="Send">
      <input type="submit" name="submit" value="Send in 1 minute">
    </form>

Hopefully you find nothing earth shattering here. Just a regular HTML form, plus the ability to show flashed messages from Flask.

The Flask-Mail extension requires some configuration, specifically the details about the email server to use when sending emails. To make things easy I use my Gmail account as email server:

# Flask-Mail configuration
app.config['MAIL_SERVER'] = 'smtp.googlemail.com'
app.config['MAIL_PORT'] = 587
app.config['MAIL_USE_TLS'] = True
app.config['MAIL_USERNAME'] = os.environ.get('MAIL_USERNAME')
app.config['MAIL_PASSWORD'] = os.environ.get('MAIL_PASSWORD')
app.config['MAIL_DEFAULT_SENDER'] = ''

Note that, to avoid putting my email account's credentials at risk, I set them in environment variables, which the application imports.

There is a single route to support this example:

@app.route('/', methods=['GET', 'POST'])
def index():
    if request.method == 'GET':
        return render_template('index.html', email=session.get('email', ''))
    email = request.form['email']
    session['email'] = email

    # send the email
    email_data = {
        'subject': 'Hello from Flask',
        'to': email,
        'body': 'This is a test email sent from a background Celery task.'
    }
    if request.form['submit'] == 'Send':
        # send right away
        send_async_email.delay(email_data)
        flash('Sending email to {0}'.format(email))
    else:
        # send in one minute
        send_async_email.apply_async(args=[email_data], countdown=60)
        flash('An email will be sent to {0} in one minute'.format(email))

    return redirect(url_for('index'))

Once again, this is all pretty standard Flask. Since this is a very simple form, I decided to handle it without the help of an extension, so I use request.method and request.form to do all the management. I save the value that the user enters in the text field in the session, so that I can remember it after the page reloads.

The data associated with the email, which is the subject, recipient(s) and body, are stored in a dictionary. The interesting bit in this route is the sending of the email, which is handled by a Celery task called send_async_email, invoked either via delay() or apply_async() with this dictionary as an argument.

The last piece of this application is the asynchronous task that gets the job done:

@celery.task
def send_async_email(email_data):
    """Background task to send an email with Flask-Mail."""
    msg = Message(email_data['subject'],
                  recipients=[email_data['to']])
    msg.body = email_data['body']
    with app.app_context():
        mail.send(msg)

This task is decorated with celery.task to make it a background job. The function constructs a Message object from Flask-Mail using the data from the email_data dictionary. One notable thing in this function is that Flask-Mail requires an application context to run, so one needs to be created before the send() method can be invoked.
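The application context requirement is not specific to Flask-Mail; any code that uses current_app, or an extension that reads the application's configuration, needs one when it runs outside of a request. Here is a minimal sketch of my own (the GREETING key is invented for illustration):

```python
from flask import Flask, current_app

app = Flask(__name__)
app.config['GREETING'] = 'Hello'

def background_job():
    # current_app only resolves inside an application context, which
    # Celery worker processes do not get automatically, so one is
    # pushed explicitly before the configuration is read.
    with app.app_context():
        return current_app.config['GREETING']
```

Without the with block, the current_app lookup would raise a "working outside of application context" error.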

It is important to note that in this example the return value from the asynchronous call is not preserved, so the application will never know if the call succeeded or not. When you get to run this example, you can look at the output of the Celery worker to troubleshoot any problems with the sending of the email.

Complex Example: Showing Status Updates and Results

The above example is overly simple: the background job is started and then the application forgets about it. Most Celery tutorials for web development end right there, but the fact is that for many applications it is necessary for the application to monitor its background tasks and obtain results from them.

What I'm going to do now is extend the above application with a second example that shows a fictitious long running task. The user can start one or more of these long running jobs by clicking a button, and the web page running in your browser uses ajax to poll the server for status updates on all these tasks. For each task the page will show a graphical status bar, a completion percentage, a status message, and when the task completes, a result value will be shown as well. You can see how all this looks in the screenshot at the top of this article.

Background Tasks with Status Updates

Let me start by showing you the background task that I'm using for this second example:

@celery.task(bind=True)
def long_task(self):
    """Background task that runs a long function with progress reports."""
    verb = ['Starting up', 'Booting', 'Repairing', 'Loading', 'Checking']
    adjective = ['master', 'radiant', 'silent', 'harmonic', 'fast']
    noun = ['solar array', 'particle reshaper', 'cosmic ray', 'orbiter', 'bit']
    message = ''
    total = random.randint(10, 50)
    for i in range(total):
        if not message or random.random() < 0.25:
            message = '{0} {1} {2}...'.format(random.choice(verb),
                                              random.choice(adjective),
                                              random.choice(noun))
        self.update_state(state='PROGRESS',
                          meta={'current': i, 'total': total,
                                'status': message})
        time.sleep(1)
    return {'current': 100, 'total': 100, 'status': 'Task completed!',
            'result': 42}

For this task I've added a bind=True argument in the Celery decorator. This instructs Celery to send a self argument to my function, which I can then use to record the status updates.

Since this task doesn't really do anything useful, I decided to use humorous status messages that are assembled from random verbs, adjectives and nouns. You can see the lists of nonsensical items I use to generate these messages above. Nothing wrong with having a little bit of fun, right?

The function loops for a random number of iterations between 10 and 50, so each run of the task will have a different duration. The random status message is generated on the first iteration, and then can be replaced in later iterations with a 25% chance.

The self.update_state() call is how Celery receives these task updates. There are a number of built-in states, such as STARTED, SUCCESS and so on, but Celery allows custom states as well. Here I'm using a custom state that I called PROGRESS. Attached to the state there is additional metadata, in the form of a Python dictionary that includes the current and total number of iterations and the randomly generated status message. A client can use these elements to display a nice progress bar. Each iteration sleeps for one second, to simulate some work being done.

When the loop exits, a Python dictionary is returned as the function's result. This dictionary includes the updated iteration counters, a final status message and a humorous result.

The long_task() function above runs in a Celery worker process. Below you can see the Flask application route that starts this background job:

@app.route('/longtask', methods=['POST'])
def longtask():
    task = long_task.apply_async()
    return jsonify({}), 202, {'Location': url_for('taskstatus',
                                                  task_id=task.id)}

As you can see, the client needs to issue a POST request to /longtask to kick off one of these tasks. The server starts the task and saves the object that represents it. For the response I used status code 202, which is normally used in REST APIs to indicate that a request is in progress. I also added a Location header, with a URL that the client can use to obtain status information. This URL points to another Flask route called taskstatus, and has the task id as a dynamic component.

Accessing Task Status from the Flask Application

The taskstatus route referenced above is in charge of reporting status updates provided by background tasks. Here is the implementation of this route:

@app.route('/status/<task_id>')
def taskstatus(task_id):
    task = long_task.AsyncResult(task_id)
    if task.state == 'PENDING':
        # job did not start yet
        response = {
            'state': task.state,
            'current': 0,
            'total': 1,
            'status': 'Pending...'
        }
    elif task.state != 'FAILURE':
        response = {
            'state': task.state,
            'current': task.info.get('current', 0),
            'total': task.info.get('total', 1),
            'status': task.info.get('status', '')
        }
        if 'result' in task.info:
            response['result'] = task.info['result']
    else:
        # something went wrong in the background job
        response = {
            'state': task.state,
            'current': 1,
            'total': 1,
            'status': str(task.info),  # this is the exception raised
        }
    return jsonify(response)

This route generates a JSON response that includes the task state and all the values that I set in the update_state() call as the meta argument, which the client can use to build a progress bar. Unfortunately this function needs to check for a few edge conditions as well, so it ended up being a bit long. To access task data I recreate the task object, which is an instance of class AsyncResult, using the task id given in the URL.

The first if block is for when the task hasn't started yet (PENDING state). In this case there is no status information, so I make up some data. The elif block that follows is the one that returns the status information from the background task. Here the information that the task provided is accessible as task.info. If the data contains a result key, then that means that this is the final result and the task finished, so I add that result to the response as well. The else block at the end covers the possibility of an error, which Celery will report by setting a task state of "FAILURE", and in that case task.info will contain the exception raised. To handle errors I set the text of the exception as a status message.

Believe it or not, this is all it takes from the server. The rest needs to be implemented by the client, which in this example is a web page with Javascript scripting.

Client-Side Javascript

It isn't really the focus of this article to describe the Javascript portion of this example, but in case you are interested, here is some information.

For the graphical progress bar I'm using nanobar.js, which I included from a CDN. I also included jQuery, which simplifies the ajax calls significantly:

<script src="//"></script>
<script src="//"></script>

The button that starts a background job is connected to the following Javascript handler:

    function start_long_task() {
        // add task status elements
        div = $('<div class="progress"><div></div><div>0%</div><div>...</div><div>&nbsp;</div></div><hr>');
        $('#progress').append(div);

        // create a progress bar
        var nanobar = new Nanobar({
            bg: '#44f',
            target: div[0].childNodes[0]
        });

        // send ajax POST request to start background job
        $.ajax({
            type: 'POST',
            url: '/longtask',
            success: function(data, status, request) {
                status_url = request.getResponseHeader('Location');
                update_progress(status_url, nanobar, div[0]);
            },
            error: function() {
                alert('Unexpected error');
            }
        });
    }
This function starts by adding a few HTML elements that will be used to display the new background task's progress bar and status. This is done dynamically because the user can add any number of jobs, and each job needs to get its own set of HTML elements.

To help you understand this better, here is the structure of the added elements for a task, with comments to indicate what each div is used for:

<div class="progress">
    <div></div>         <-- Progress bar
    <div>0%</div>       <-- Percentage
    <div>...</div>      <-- Status message
    <div>&nbsp;</div>   <-- Result
</div>

The start_long_task() function then instantiates the progress bar according to nanobar's documentation, and finally sends the ajax POST request to /longtask to initiate the Celery background job in the server.

When the POST ajax call returns, the callback function obtains the value of the Location header, which, as you saw in the previous section, is the URL the client needs to invoke to get status updates. It then calls another function, update_progress(), with this status URL, the progress bar object and the root div element subtree created for the task. Below you can see this update_progress() function, which sends the status request and then updates the UI elements with the information returned by it:

    function update_progress(status_url, nanobar, status_div) {
        // send GET request to status URL
        $.getJSON(status_url, function(data) {
            // update UI
            percent = parseInt(data['current'] * 100 / data['total']);
            nanobar.go(percent);
            $(status_div.childNodes[1]).text(percent + '%');
            $(status_div.childNodes[2]).text(data['status']);
            if (data['state'] != 'PENDING' && data['state'] != 'PROGRESS') {
                if ('result' in data) {
                    // show result
                    $(status_div.childNodes[3]).text('Result: ' + data['result']);
                }
                else {
                    // something unexpected happened
                    $(status_div.childNodes[3]).text('Result: ' + data['state']);
                }
            }
            else {
                // rerun in 2 seconds
                setTimeout(function() {
                    update_progress(status_url, nanobar, status_div);
                }, 2000);
            }
        });
    }

This function sends the GET request to the status URL, and when a response is received it updates the different HTML elements for the task. If the background task completed and a result is available then it is added to the page. If there is no result then that means that the task ended due to an error, so the task state, which is going to be FAILURE, is shown as result.

When the server is still running the job I need to continue polling the task status and updating the UI. To achieve this I set a timer to call the function again in two seconds. This will continue until the Celery task completes.

A Celery worker runs as many concurrent jobs as there are CPUs by default, so when you play with this example make sure you start a large number of tasks to see how Celery keeps jobs in the PENDING state until a worker can take them.

Running the Examples

If you made it all the way here without running the example application, then it is now time for you to try all this Celery goodness. Go ahead and clone the Github repository, create a virtual environment, and populate it:

$ git clone
$ cd flask-celery-example
$ virtualenv venv
$ source venv/bin/activate
(venv) $ pip install -r requirements.txt

Note that the requirements.txt file included with this repository contains Flask, Flask-Mail, Celery and the Redis client, along with all their dependencies.

Now you need to run the three processes required by this application, so the easiest way is to open three terminal windows. On the first terminal run Redis. You can just install Redis according to the download instructions for your operating system, but if you are on a Linux or OS X machine, I have included a small script that downloads, compiles and runs Redis as a private server:

$ ./

Note that for the above script to work you need to have gcc installed. Also note that the above command is blocking; Redis will run in the foreground.

On the second terminal run a Celery worker. This is done with the celery command, which is installed in your virtual environment. Since this is the process that will be sending out emails, the MAIL_USERNAME and MAIL_PASSWORD environment variables must be set to a valid Gmail account before starting the worker:

$ export MAIL_USERNAME=<your-gmail-username>
$ export MAIL_PASSWORD=<your-gmail-password>
$ source venv/bin/activate
(venv) $ celery worker -A app.celery --loglevel=info

The -A option gives Celery the application module and the Celery instance, and --loglevel=info makes the logging more verbose, which can sometimes be useful in diagnosing problems.

Finally, on the third terminal window run the Flask application, also from the virtual environment:

$ source venv/bin/activate
(venv) $ python

Now you can navigate to http://localhost:5000/ in your web browser and try the examples!


Unfortunately when working with Celery you have to take a few more steps than simply sending a job to a background thread, but the benefits in flexibility and scalability are hard to ignore. In this article I tried to go beyond the "let's start a background job" example and give you a more complete and realistic portrait of what using Celery might entail. I sincerely hope I haven't scared you with too much information!

As always, feel free to write down any questions or comments below.



  • #101 Miguel Grinberg said

    @Houman: I have shown how to integrate Celery and SQLAlchemy on a few of my projects. Take a look at these two for examples:

  • #102 Huynh said

    Hi Miguel,

    Thanks for your great instruction. When initiating Celery and Flask, this following line has been run twice: (1) when Celery process starts, and (2) when Flask application starts

    # Initialize extensions

    mail = Mail(app)

    I have adopted this model on my application, it required a heavy loading process during the initiation (10GB for run). Is it possible to
    run both Celery and Flask with only once executing the "Initialize extensions".

  • #103 Miguel Grinberg said

    @Huynh: not sure I understand. The Flask server is a process, and then Celery runs one or more worker processes. These are all independent entities, they cannot share data. If you don't need a Flask application instance on the Celery workers, then don't create one when Celery starts.

  • #104 kirk said

    nice demo. Have you any new insight into this task processing system? Do people still use this type of setup?

  • #105 Jack Mullen said

    Hello Miquel:

    I hope this finds you in a happy place today .. :)

    I am trying to use threaded tasks, several, while using flask and Flask-SocketIO. I have had much success using the socketio module inside the main context of flask -- but much difficulty using threads. My many-hour problem right now is I start a task running in the background ( I have tried all the thread methods you suggest) which completes and using a passed-into-thread function -- calls back to flask with some data -- and that Method (def) then packages the data correctly and attempts to use socketio to notify the browser ..

    Ie., my thread (this version) starts like this registrationThread = regi.registration(mycallback_method)

    The call back works fine but as soon as the callback fires socketio.emit("my response", data) -- all communications between the client ( browser and the flask server is ended -- no error messages -- the server is still responsive to normal app.route() requests but no longer will socketio rec or emit data.

    I have tried to eliminate sending the returned data and just sending a string back -- but still locks. The issue is this : As soon as something from the finished thread attempts to call a function that uses socketio -- it crashes -- It must be a context thing but I have struggled for a long time on it -- Do you have any ideas --

    And thank you for responding if you can.

  • #106 Miguel Grinberg said

    @kirk: not sure what you mean with your question. The techniques I presented in this article still work today.

  • #107 Miguel Grinberg said

    @Jack: have you seen my flack example on Github? That shows how I emit Socket.IO events from a Celery task. Maybe that'll help you structure your code.

  • #108 Leon said

    Hi Miguel. Thanks for this tutorial. My question is that what if I want to separate the producer(web server) and consumer(remote server sending out emails). It seem like that both sides should have code related to celery part, so web server can call the task and celery worker can run on email server. Redis or Rabbitmq can run on either side.

  • #109 Miguel Grinberg said

    @Leon: what do you mean exactly by "separate"? If you organize your project like I did for this tutorial, then both the consumer and the producer have access to the complete application, so your application code must be installed on both sides (the server will only be running on the producer side; the consumer will just instantiate a Flask app instance but will not run it as a server). The Celery package also needs to be installed on both sides. The message queue can be anywhere you like, it does not need to be with either of these two, all you need to do is point producer and consumer at it.

    If by separate you meant separate source repositories, that requires a careful sharing of code. The producer will be the Flask app, with a dependency on Celery (which will be in requirements.txt). The consumer will be a separate repo that contains all the tasks, also has a dependency on Celery, and has a dependency on the Flask app, since the tasks will need a Flask app instance to run.

  • #110 matchi said

    @Miguel Thank you so much for your tutorial.
    I get this error when the page loads:
    OperationalError: Error 61 connecting to localhost:6379. Connection refused.
    Could you please advise me on how I can fix the error?

    Thanks :)

  • #111 Miguel Grinberg said

    @matchi: did you install Redis on your machine? You have to have Redis running on port 6379 for the server to communicate with Celery.

  • #112 Matt Healy said

    Hi Miguel

    Just a little note, newer versions of Celery (version 4 onwards) don't use pickle as the default serializer.

    Due to this, your example would need to be re-worked to construct the Flask-Mail.Message object inside the celery task, as it can't be passed as an argument because it is not JSON-serializable. This tripped me up when I updated my celery library to version 4.


  • #113 Peter said

    Hi Miguel,
    I really like your book and use flask a lot these days.
    This example is very helpful and I used it as a basis in my app, which follows the large application structure in your book. However, I can't manage to set celery up to work there. Can you comment on where to import what in the large application structure to make this project work?


  • #114 Miguel Grinberg said

    @Peter: Have you seen this project: That's the Flasky app from the book, modified to use Celery workers to send emails. Also look at the issue reported in the repository, as that has to do with a change in recent Celery versions regarding serialization of data between the Celery client and the workers.

  • #115 Devraj said

    Hi Miguel,

    Thanks for the tutorial. My question is that how do I setup remote Celery workers on different Linux containers?


  • #116 Miguel Grinberg said

    @Devraj: there is really no difference, you just need to make sure the two container images for the server and the celery workers have the same code and are configured with the same message queue.

  • #117 Toan Nguyen said

    Thank you Miguel for another great tutorial! I followed your tutorial and managed to make it work in my project, which runs Flask + MongoDB + RabbitMQ on Ubuntu. I deploy the app to be accessed via Apache + mod_wsgi. It works well. I just have one question: is it possible to start the celery worker automatically in the wsgi script just like the way the virtual environment is activated?

    My wsgi script is as follows:

    import sys
    sys.path.insert(0, '/var/www/dck/')
    activate_this = '/home/xxx/.virtualenvs/dck/bin/'
    execfile(activate_this, dict(file=activate_this))
    from apps import app as application

    # celery -A apps.celery worker

    Thank you.

  • #118 Miguel Grinberg said

    @Toan: You can do that, but I'm not sure that is very convenient. It seems to me that restarting the Celery workers when you do an upgrade will be difficult when the service is tied to the virtual environment. Have you looked at supervisord? That is what I use to start/stop/restart service.

  • #119 Zakaria said

    thanx for this tutorial,

    Could we start the worker from another host, in this case the flask app context will be available for the worker ? for example to do some database interactions

  • #120 Miguel Grinberg said

    @Zakaria: Yes, you can run the Celery workers on a different host, or even multiple hosts.

  • #121 Lakshmi Narayan(Nani) said

    Great article. Thank you

  • #122 Gokhan said

    Miguel thank you for the tutorial. You are doing a marvelous job by touching every useful details and explaining them with patience.

    I want to ask about using celery.task decorator in a class method. I tried to do this and call that method with its class but got error "'AsyncResult' object is not callable".

    Does that mean it is not possible to use celery decorators with class methods? And what should someone do to use asynchronous processing with Flask while using design patterns (which require classes)?

    My case is below:
    class MyClass:
    def myFunc(self):
    return anotherFunc(a_variable)

    @app.route('/', methods=['POST'])
    def lonelyFunc():
    return MyClass.myFunc.delay()

  • #123 Miguel Grinberg said

    @Gokhan: The problem is likely that Celery has no idea what is the value of "self". I think this is probably going to work if you make your method a static method using the @staticmethod decorator, which removes the self argument.

  • #124 Suman said

    [2017-08-07 13:32:13,192: ERROR/MainProcess] Task app.send_async_email[f0bcde22-d18d-4cbd-a1f5-a23ffbb1640a] raised unexpected: AttributeError("'Flask' object has no attribute 'app_context'",)
    Traceback (most recent call last):
    File "/usr/local/lib/python2.7/dist-packages/celery/app/", line 240, in trace_task
    R = retval = fun(*args, **kwargs)
    File "/usr/local/lib/python2.7/dist-packages/celery/app/", line 438, in __protected_call__
    return self.run(*args, **kwargs)
    File "/home/suman/workspace/flask-celery-example/", line 37, in send_async_email
    with app.app_context():
    AttributeError: 'Flask' object has no attribute 'app_context'

  • #125 Miguel Grinberg said

    @Suman: Can't really say exactly what's going on, but the error is weird. It seems your Flask application instance is corrupted somehow. Do you have the complete project on a place I can see it?
