
Managing Background Tasks in FastAPI: BackgroundTasks vs ARQ + Redis

An illustration showing a FastAPI application sending tasks to an ARQ worker queue backed by Redis.
By David Muraya • August 18, 2025

When you build a web API, some tasks take too long to run inside a single request. Things like sending emails, processing images, or calling slow external services can time out or make your application feel unresponsive. The solution is to offload these jobs to a background worker.

This article walks through a FastAPI project on GitHub that uses ARQ and Redis to manage background tasks. It provides a solid, production-ready template for handling asynchronous jobs in your own applications.

Choosing the Right Tool for Background Tasks

When it comes to background tasks in FastAPI, you have a few options. It's important to understand the trade-offs.

FastAPI's Built-in BackgroundTasks

First, FastAPI has a built-in BackgroundTasks feature. It's very simple to use and is great for "fire-and-forget" tasks that don't need to be tracked, like sending a notification email after a user signs up.

However, the built-in solution has significant limitations for more complex applications:

  • No Status Tracking: You can't check if a task has started, is running, or has completed.
  • No Result Retrieval: There is no way to get the return value of a task.
  • Tied to the Web Process: The tasks run in the same process as your web server. A CPU-intensive task could slow down your entire API, and if the server crashes, your tasks are lost.

Why Use a Dedicated Queue like ARQ?

For anything beyond simple, non-critical tasks, you need a dedicated task queue. This is where tools like ARQ or Celery come in. They solve all the limitations of the built-in approach by running tasks in separate worker processes.

Many Python developers reach for Celery for background tasks. But this project uses ARQ for a simple reason: it's built for asyncio.

FastAPI is an async framework. Your API routes are async def functions. ARQ is designed to work naturally with this structure. Task functions in ARQ are also async def, so you don't need extra libraries or workarounds to make them work together. Celery, on the other hand, was originally designed for synchronous code, and using it with async functions requires more setup. For a modern FastAPI application, ARQ is a simpler, more direct fit.
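An ARQ task is just an async function whose first argument is the worker context, so (as a small illustrative example) you can await or run it directly like any other asyncio code:

```python
import asyncio

# An ARQ task is a plain async def; ARQ passes a context dict as the
# first argument when the worker runs it.
async def add(ctx: dict, x: int, y: int) -> int:
    return x + y

# Because it is ordinary asyncio code, you can also run it directly:
result = asyncio.run(add({}, 5, 10))
print(result)  # → 15
```

This is the core of ARQ's appeal: the same function works in your tests, in other async code, and in the worker, with no sync-to-async bridging.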

Quick Guide: When to Use Which?

Here’s a simple breakdown to help you decide:

Use FastAPI's BackgroundTasks when:

  • The task is fast and lightweight (a few seconds at most).
  • Failure isn't critical (e.g., sending an analytics ping or a "welcome" email that can be retried later if it fails).
  • You just need to "fire and forget" without needing to know the result.

Use a Job Queue (like ARQ) when:

  • The task is long-running or CPU-heavy (e.g., video processing, report generation).
  • You need reliability with retries, scheduling, or persistence.
  • You need to scale your background workers independently of your web server.
  • Monitoring, debugging, or auditing the status of tasks is important.

How the Project Works

The project is a complete example of how to connect FastAPI, ARQ, and Redis. It's not just a "hello world" example; it includes features you'd need in a real application.

Here’s what it does:

  • Enqueue Jobs via API: You can send a request to a FastAPI endpoint, which then puts a job onto a Redis queue. The API immediately returns a job ID, so the client isn't left waiting.
  • Process Jobs with a Worker: A separate ARQ worker process listens to the Redis queue, picks up jobs, and executes them in the background.
  • Check Job Status: You can use the job ID to check the status of a task through another API endpoint.
  • Persist Job History: Once a job is complete, its details (like status and result) are saved to a SQLite database for auditing and long-term tracking.
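The enqueue step can be sketched like this. `FakePool` is a stand-in for the ARQ Redis pool (in the real project the pool would come from arq's `create_pool`), and `enqueue_add` mirrors what the endpoint does:

```python
import asyncio
from dataclasses import dataclass

@dataclass
class FakeJob:
    job_id: str

class FakePool:
    # Stand-in for an ARQ pool: enqueue_job pushes a job onto the
    # Redis queue and returns a Job handle with a unique ID.
    async def enqueue_job(self, func_name: str, *args):
        return FakeJob(job_id="demo-job-id")

async def enqueue_add(pool, x: int, y: int) -> dict:
    # Enqueue and return immediately with the job ID, so the HTTP
    # client never waits for the work itself.
    job = await pool.enqueue_job("add", x, y)
    return {"job_id": job.job_id}

print(asyncio.run(enqueue_add(FakePool(), 5, 10)))
```

The key design point is that the endpoint's only job is to hand work to Redis and report the ID; everything slow happens in the worker.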

Setting It Up

Getting the project running is straightforward. First, you need to have Redis installed and running.

Then, clone the repository and install the dependencies.

git clone https://github.com/davidmuraya/fastapi-arq.git
cd fastapi-arq
pip install -r requirements.txt

Next, create a .env file in the root directory to configure the Redis connection, queue name, and database file.

REDIS_BROKER=localhost:6379
WORKER_QUEUE=my-fastapi-queue
JOBS_DB=database/jobs.db

Now you can run the application. You'll need two separate terminals.

In the first terminal, start the FastAPI server:

uvicorn main:app --reload --port 5000

In the second terminal, start the ARQ worker:

arq worker:WorkerSettings

The worker will automatically connect to Redis and start listening for jobs on the queue you defined.
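A minimal sketch of what that worker module contains (the real project's worker.py will differ in detail; for instance, it also sets `redis_settings` to point at the broker from .env):

```python
# worker.py (sketch) -- the module `arq worker:WorkerSettings` points at.

async def add(ctx: dict, x: int, y: int) -> int:
    return x + y

class WorkerSettings:
    # The task functions this worker is allowed to run...
    functions = [add]
    # ...and the queue it polls, matching WORKER_QUEUE in .env.
    queue_name = "my-fastapi-queue"
```

This is how the worker knows what code to run: it imports the functions listed in `WorkerSettings.functions` and matches incoming jobs to them by name.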

A Quick Example

Once everything is running, you can enqueue a job. The project includes a simple add task that adds two numbers.

curl -X POST "http://localhost:5000/tasks/add" \
-H "Content-Type: application/json" \
-d '{"x": 5, "y": 10}'

The API will respond immediately with a JSON object containing the job_id.

{
  "job_id": "your-unique-job-id"
}

You can then use this ID to check the job's status.

curl "http://localhost:5000/jobs/your-unique-job-id"

The response will tell you if the job is complete and show you the result.

{
  "id": "your-unique-job-id",
  "status": "complete",
  "result": 15,
  "start_time": "2025-08-18T12:00:00.000Z",
  "finish_time": "2025-08-18T12:00:01.000Z"
}

The project also includes examples for more complex tasks, like making a retriable HTTP request or scheduling a job to run at a specific time.
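Scheduling works through ARQ's `_defer_until` keyword to `enqueue_job` (there is also `_defer_by` for relative delays). A sketch with a stand-in pool; the `generate_report` task name is hypothetical:

```python
import asyncio
from datetime import datetime, timedelta, timezone

class FakePool:
    # Stand-in for an ARQ pool that just records the scheduling kwargs.
    async def enqueue_job(self, func_name: str, *args, **kwargs):
        return {"func": func_name, "defer_until": kwargs.get("_defer_until")}

async def schedule_report(pool):
    run_at = datetime.now(timezone.utc) + timedelta(hours=1)
    # The worker will not pick this job up before run_at.
    return await pool.enqueue_job("generate_report", _defer_until=run_at)

job = asyncio.run(schedule_report(FakePool()))
print(job["func"])  # → generate_report
```

The job sits in Redis until its scheduled time, so the web process doesn't need to stay up or keep a timer running.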

Final Thoughts

Handling background tasks is a common requirement for web applications. This project provides a clear and practical blueprint for doing it in FastAPI with ARQ. It shows how to separate your API from your workers, manage job state, and build a reliable system for handling long-running processes. If you're looking for a straightforward way to add background jobs to your async Python project, give ARQ a try.


FAQ

Q: Why not just use FastAPI's built-in BackgroundTasks?
A: FastAPI's BackgroundTasks is great for simple, "fire-and-forget" tasks. However, it doesn't allow you to check a job's status, retrieve its result, or run tasks in a separate process. For reliable, scalable applications where you need to monitor jobs, handle failures, and avoid blocking the web server, a dedicated queue system like ARQ is a much better approach.

Q: What is the difference between ARQ and Celery?
A: The main difference is that ARQ is designed for asyncio from the ground up, making it a natural fit for async frameworks like FastAPI. Celery is older and was built for synchronous code, so using it with async functions requires extra configuration.

Q: Do I need Redis to use ARQ?
A: Yes, ARQ uses Redis as its message broker to manage the queue of tasks between your web application and the workers. You will need a running Redis instance.

Q: How does the worker know what code to run?
A: The ARQ worker is configured to import your task functions (from tasks.py in this project). When it receives a job from the queue, it knows which function to call and what arguments to pass to it.

Q: Can I schedule a task to run in the future?
A: Yes. The project includes an example of a scheduled task using ARQ's defer_until feature, which allows you to enqueue a job that will only be executed after a specific datetime.

Q: Is this setup ready for production?
A: The project provides a solid foundation for a production setup. For a real-world deployment, you would run the FastAPI server and the ARQ worker as separate services (e.g., in different Docker containers) and use a production-grade Redis server.

About the Author

David Muraya is a Solutions Architect specializing in Python, FastAPI, and Cloud Infrastructure. He is passionate about building scalable, production-ready applications and sharing his knowledge with the developer community. You can connect with him on LinkedIn.

Related Blog Posts

Enjoyed this blog post? Check out these related posts!

How to Set Up a Custom Domain for Your Google Cloud Run service
Optimizing Reflex Performance on Google Cloud Run
A Comparison of Gunicorn, Uvicorn, and Granian for Running Reflex Apps
Simple CI/CD for Your FastAPI App with Google Cloud Build and Cloud Run
Running Database Migrations with Alembic in Google Cloud Build
How to Organize and Load FastAPI Settings from a .env File Using Pydantic v2

Contact Me

Have a project in mind? Send me an email at hello@davidmuraya.com and let's bring your ideas to life. I am always available for exciting discussions.

© 2025 David Muraya. All rights reserved.