Accepting file uploads is a fundamental feature for many web applications, whether it's for user avatars, document submissions, or media content. A naive approach of reading the entire file into memory works for small files but quickly leads to performance bottlenecks and server crashes when faced with larger uploads.
This guide provides a practical approach to building a robust file upload endpoint in FastAPI. We'll start with a simple upload, then cover how to stream large files directly to disk to keep memory usage low, validate file types and sizes, and handle uploads efficiently in a production environment.
For small files, like user avatars, reading the entire file into memory is a straightforward approach. FastAPI makes this easy with its UploadFile and File components.
- UploadFile: This is the main class for working with uploaded files. It's an async-compatible wrapper that provides methods like .read(), .write(), and .seek().
- File(): This is a dependency function you use in your path operation to declare that you are expecting file data as part of a multipart/form-data request.

Here is an example of a basic endpoint that saves an uploaded file directly to a directory:
```python
# filepath: main.py
import os
import uuid

import aiofiles
from fastapi import FastAPI, File, HTTPException, UploadFile

app = FastAPI()

UPLOAD_DIR = "uploads"

# Ensure the upload directory exists
os.makedirs(UPLOAD_DIR, exist_ok=True)


@app.post("/upload/simple")
async def upload_simple(file: UploadFile = File(...)):
    """
    Saves a file to the upload directory with a secure, unique filename.
    """
    if file.content_type not in {"image/jpeg", "image/png"}:
        raise HTTPException(status_code=415, detail="Unsupported file type.")

    # Generate a secure, unique filename to prevent path traversal and overwrites
    file_extension = os.path.splitext(file.filename)[1]
    secure_filename = f"{uuid.uuid4()}{file_extension}"
    file_path = os.path.join(UPLOAD_DIR, secure_filename)

    try:
        async with aiofiles.open(file_path, "wb") as out_file:
            content = await file.read()  # Read the entire file into memory
            await out_file.write(content)  # Write it to disk
    except Exception as e:
        raise HTTPException(status_code=500, detail=f"Error saving file: {e}")
    finally:
        # Ensure the temporary file is closed
        await file.close()

    return {
        "filename": file.filename,
        "content_type": file.content_type,
        "stored_at": file_path,
    }
```
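You can exercise this endpoint from a short client script. Here is a minimal sketch using the requests library; the localhost URL and the avatar.png filename are assumptions about your local setup:

```python
# Hypothetical local test client for the /upload/simple endpoint.
# The server URL and the sample file name are illustrative assumptions.
import requests

with open("avatar.png", "rb") as f:
    response = requests.post(
        "http://localhost:8000/upload/simple",
        # The tuple is (filename, file object, content type)
        files={"file": ("avatar.png", f, "image/png")},
    )

print(response.status_code, response.json())
```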
This method is simple, but it has a major drawback: await file.read() loads the entire file into RAM. This is not scalable and can easily crash your application.
Just like with downloads, reading an entire uploaded file into memory before processing it is a bad practice for anything other than very small files.
If a user uploads a 500 MB file and your endpoint calls await file.read(), your server's memory usage will spike by 500 MB for that single request. This does not scale and can easily crash your application under concurrent loads.

Streaming the upload directly to a file on disk solves these problems by processing the file in small, manageable chunks. This keeps memory usage low and constant, regardless of the file size.
Let's build a more robust endpoint that securely accepts a file, validates it, and streams it to a designated upload directory. This approach is memory-efficient and suitable for files of any size.
```python
# filepath: main.py
import os
import uuid

import aiofiles
from fastapi import FastAPI, File, HTTPException, Request, UploadFile

app = FastAPI()

UPLOAD_DIR = "uploads"
MAX_FILE_SIZE = 1024 * 1024 * 25  # 25 MB
ALLOWED_MIME_TYPES = {"image/jpeg", "image/png", "application/pdf"}

# Ensure the upload directory exists
os.makedirs(UPLOAD_DIR, exist_ok=True)


@app.post("/upload/stream")
async def upload_stream(request: Request, file: UploadFile = File(...)):
    """
    Streams a file to disk with size and content-type validation.
    """
    # 1. Validate Content-Type
    if file.content_type not in ALLOWED_MIME_TYPES:
        raise HTTPException(status_code=415, detail="Unsupported file type.")

    # 2. Reject oversized requests early via the Content-Length header.
    # The header can be spoofed, so the limit is enforced again while streaming.
    content_length = request.headers.get("content-length")
    if not content_length:
        raise HTTPException(status_code=411, detail="Content-Length header required.")
    if int(content_length) > MAX_FILE_SIZE:
        raise HTTPException(
            status_code=413,
            detail=f"File too large. Limit is {MAX_FILE_SIZE / 1024 / 1024} MB.",
        )

    # 3. Stream file to disk under a secure, unique filename
    file_extension = os.path.splitext(file.filename)[1]
    secure_filename = f"{uuid.uuid4()}{file_extension}"
    file_path = os.path.join(UPLOAD_DIR, secure_filename)
    bytes_written = 0
    try:
        async with aiofiles.open(file_path, "wb") as out_file:
            while chunk := await file.read(1024 * 1024):  # Read in 1 MB chunks
                bytes_written += len(chunk)
                if bytes_written > MAX_FILE_SIZE:
                    raise HTTPException(status_code=413, detail="File too large.")
                await out_file.write(chunk)
    except HTTPException:
        # Clean up the partially written file before re-raising
        if os.path.exists(file_path):
            os.remove(file_path)
        raise
    except Exception as e:
        # Clean up partially written file in case of an error
        if os.path.exists(file_path):
            os.remove(file_path)
        raise HTTPException(status_code=500, detail=f"Error saving file: {e}")
    finally:
        # Ensure the temporary file is closed
        await file.close()

    return {"filename": file.filename, "stored_at": file_path}
```
In this code:
- We define an UPLOAD_DIR and ensure it exists.
- We set a MAX_FILE_SIZE limit and a set of ALLOWED_MIME_TYPES for validation.
- We check file.content_type against our allowed list.
- We inspect the Content-Length header to reject oversized files early, before any data is processed, and enforce the limit again while streaming, since the header can be spoofed.
- We generate a unique filename and stream the file to disk in 1 MB chunks using aiofiles, which provides an async interface for file operations.

An uploaded file is more useful with context. Storing metadata - such as who uploaded the file, when it was uploaded, and its original content type - in a database is essential for tracking and management.
A file checksum (like SHA-256) is also highly valuable. It's a unique signature for the file's content, which allows you to:

- Detect duplicate uploads without comparing files byte by byte.
- Verify later that a stored file has not been corrupted or tampered with.
Here is a function to generate a SHA-256 checksum for a file after it has been saved to disk:
```python
import hashlib
from pathlib import Path


def generate_sha256_checksum(file_path: Path) -> str:
    """Generates a SHA-256 checksum for a file."""
    sha256_hash = hashlib.sha256()
    with open(file_path, "rb") as f:
        # Read and update hash in chunks to handle large files
        for byte_block in iter(lambda: f.read(4096), b""):
            sha256_hash.update(byte_block)
    return sha256_hash.hexdigest()


# After saving the file in your endpoint:
# checksum = generate_sha256_checksum(Path(file_path))
# Then, store the checksum and other metadata in your database:
# INSERT into uploads(user_id, path, checksum, content_type, created_at) ...
```
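To make that INSERT concrete, here is a minimal sketch using Python's built-in sqlite3 module; the uploads table, its columns, and the sample values are assumptions for illustration, not a prescribed schema:

```python
# Minimal sketch: persisting upload metadata with the standard-library
# sqlite3 module. The table name, columns, and sample values below are
# illustrative assumptions.
import sqlite3
from datetime import datetime, timezone

conn = sqlite3.connect("app.db")
conn.execute(
    """
    CREATE TABLE IF NOT EXISTS uploads (
        user_id TEXT,
        path TEXT,
        checksum TEXT,
        content_type TEXT,
        created_at TEXT
    )
    """
)

# Values you would normally collect inside the upload endpoint
record = (
    "user-123",                             # authenticated user's ID
    "uploads/3f2b2d6e.png",                 # path returned by the endpoint
    "e3b0c44298fc1c149afbf4c8996fb924...",  # SHA-256 checksum (truncated sample)
    "image/png",
    datetime.now(timezone.utc).isoformat(),
)
conn.execute(
    "INSERT INTO uploads (user_id, path, checksum, content_type, created_at) "
    "VALUES (?, ?, ?, ?, ?)",
    record,
)
conn.commit()
conn.close()
```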
1. How do I handle multiple file uploads in a single request?
You can type-hint the file parameter as a list[UploadFile]. FastAPI will automatically handle parsing multiple files from the form data.
```python
from typing import List


@app.post("/upload/multiple")
async def upload_multiple(files: List[UploadFile] = File(...)):
    for file in files:
        # Process each file as in the single upload example
        ...
    return {"status": f"{len(files)} files uploaded."}
```
2. Is checking the content-type header enough to be secure?
No. The content-type header is sent by the client and can be easily spoofed. For higher security, you should validate the file's "magic numbers" (the first few bytes of the file that identify its type).
A robust library for this is python-magic. You can read the first few bytes of the uploaded file, determine its true MIME type, and then rewind the file to be streamed to disk.
```python
import magic
from fastapi import FastAPI, File, HTTPException, UploadFile

# In a real app, you would have this defined elsewhere
ALLOWED_MIME_TYPES = {"image/jpeg", "image/png", "application/pdf"}


@app.post("/upload/validate")
async def upload_and_validate(file: UploadFile = File(...)):
    # Read the first 2048 bytes to determine the file type
    header = await file.read(2048)
    await file.seek(0)  # Rewind the file to the beginning for the next read

    # Use python-magic to get the MIME type from the header bytes
    mime_type = magic.from_buffer(header, mime=True)

    if mime_type not in ALLOWED_MIME_TYPES:
        raise HTTPException(
            status_code=415,
            detail=f"Unsupported file type: {mime_type}. "
                   f"Allowed types are {', '.join(ALLOWED_MIME_TYPES)}.",
        )

    # If valid, proceed to stream the file to disk as shown previously
    # ...
    return {"status": "File is valid", "mime_type": mime_type}
```
To use python-magic, you need to install both the Python package and its underlying C library dependency, libmagic.
- Python package: `pip install python-magic`
- System dependency (libmagic):
  - Windows: `pip install python-magic-bin` instead, which bundles the library.
  - Debian/Ubuntu: `apt-get update && apt-get install -y libmagic1`
  - macOS: `brew install libmagic`

Additionally, ensure your upload endpoint is protected by a robust authentication system.
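As a minimal sketch of such protection (not a complete auth system), you could gate the route behind an API-key dependency; the X-API-Key header name and the hard-coded key below are illustrative assumptions:

```python
# Minimal sketch: require an API key before accepting uploads.
# The header name and the hard-coded key are illustrative assumptions;
# load real secrets from configuration or a secrets manager.
import secrets
from typing import Optional

from fastapi import Depends, FastAPI, File, HTTPException, Security, UploadFile
from fastapi.security import APIKeyHeader

app = FastAPI()
api_key_header = APIKeyHeader(name="X-API-Key", auto_error=False)


async def verify_api_key(api_key: Optional[str] = Security(api_key_header)) -> None:
    # compare_digest avoids leaking information through timing differences
    if not api_key or not secrets.compare_digest(api_key, "change-me"):
        raise HTTPException(status_code=401, detail="Invalid or missing API key.")


@app.post("/upload/protected", dependencies=[Depends(verify_api_key)])
async def upload_protected(file: UploadFile = File(...)):
    # Validation and streaming logic from the earlier examples goes here
    return {"filename": file.filename}
```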
3. How can I show upload progress on the frontend?
You can use XMLHttpRequest in JavaScript and listen for its upload.onprogress event. This event reports the loaded and total bytes, allowing you to calculate and display a real-time progress bar.
```javascript
const fileInput = document.querySelector('input[type=file]');
const file = fileInput.files[0];

const formData = new FormData();
formData.append('file', file);

const xhr = new XMLHttpRequest();
xhr.open('POST', '/upload/stream', true);

xhr.upload.onprogress = (event) => {
  if (event.lengthComputable) {
    const percentComplete = Math.round((event.loaded / event.total) * 100);
    console.log(`Upload Progress: ${percentComplete}%`);
  }
};

xhr.send(formData);
```
4. What are presigned URLs and should I use them?

For very large files or applications with high upload traffic, streaming through your API can still be a bottleneck. The presigned URL pattern solves this by letting the client upload directly to cloud storage (like Amazon S3 or Google Cloud Storage). Your API's role is just to generate a secure, short-lived URL that grants the client temporary write access. This offloads the heavy lifting from your server entirely and is a common pattern for high-performance applications deployed on platforms like Google Cloud Run.
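Here is a minimal sketch of this pattern with Amazon S3 and boto3; the bucket name, key prefix, and 15-minute expiry are assumptions:

```python
# Minimal sketch: issue a presigned S3 PUT URL from a FastAPI endpoint.
# The bucket name, key prefix, and expiry are illustrative assumptions.
import uuid

import boto3
from fastapi import FastAPI

app = FastAPI()
s3_client = boto3.client("s3")


@app.post("/upload/presigned")
async def create_presigned_upload(filename: str, content_type: str):
    # Namespace the object key so client-chosen filenames cannot collide
    key = f"uploads/{uuid.uuid4()}-{filename}"
    url = s3_client.generate_presigned_url(
        "put_object",
        Params={"Bucket": "my-example-bucket", "Key": key, "ContentType": content_type},
        ExpiresIn=900,  # the URL expires after 15 minutes
    )
    # The client PUTs the file bytes directly to this URL; your server
    # never touches the file contents.
    return {"upload_url": url, "key": key}
```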
About the Author
David Muraya is a Solutions Architect specializing in Python, FastAPI, and Cloud Infrastructure. He is passionate about building scalable, production-ready applications and sharing his knowledge with the developer community. You can connect with him on LinkedIn.
Contact Me
Have a project in mind? Send me an email at hello@davidmuraya.com and let's bring your ideas to life. I am always available for exciting discussions.