
Running Database Migrations with Alembic in Google Cloud Build


Keeping your database schema in sync with your application code is essential: deploying new code against an old schema leads to runtime errors. Alembic manages database migrations for SQLAlchemy projects. It tracks changes to your models and lets you apply those changes to your database in a controlled, repeatable way.

If you're interested in setting up a full CI/CD pipeline for FastAPI on Google Cloud Run, check out my article: How to build a CI/CD pipeline on Google Cloud Run for a FastAPI application.

If you need a guide on connecting FastAPI to a SQL database and building CRUD APIs, see: Connecting FastAPI to a Database with SQLModel.

What is Alembic?

Alembic is a lightweight database migration tool for Python. It works with SQLAlchemy and lets you create migration scripts when your models change. These scripts can add or remove tables, columns, or indexes. You run Alembic commands to apply these changes to your database.
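To make that concrete, here is a minimal, hypothetical SQLAlchemy model (the `User` class and its columns are illustrative, not from this project). Adding a field like `full_name` to such a class is exactly the kind of change Alembic detects and turns into a migration:

```python
from sqlalchemy import Column, Integer, String
from sqlalchemy.orm import declarative_base

Base = declarative_base()


class User(Base):
    """Hypothetical model: adding a column here is the kind of
    change Alembic picks up when generating a migration."""

    __tablename__ = "users"

    id = Column(Integer, primary_key=True)
    email = Column(String, nullable=False)
    # Newly added field -> autogenerate emits an op.add_column() for it
    full_name = Column(String, nullable=True)
```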

How Alembic Works

Alembic stores migration scripts in a versions folder under the alembic directory, and records which revision your database is currently on in an alembic_version table in the database itself. When you change your models, you generate a new migration file: Alembic compares your models to the current database schema and writes out the differences. You then run Alembic to apply the migration.

Generating a Migration

When you update your models, you need to create a migration file. Alembic can detect changes automatically. Here’s how you do it:

alembic revision --autogenerate -m "Describe your change here"

This command creates a new migration script containing the changes Alembic detects. Autogenerate is not perfect (a renamed column, for example, is detected as a drop plus an add), so always review the generated file before applying it.

Applying Migrations in Google Cloud Build

You can automate database migrations in your deployment pipeline using Google Cloud Build. Here’s a sample cloudbuild.yaml file that shows how to do this:

steps:
  # Step 1: Build the Docker image
  - name: 'gcr.io/cloud-builders/docker'
    args: [
      'build',
      '-t', 'gcr.io/$PROJECT_ID/davidmuraya:$COMMIT_SHA',
      '.',
      '--build-arg', 'VITE_BACKEND_HOST='
    ]

  # Step 2: Run Alembic migrations using the built image
  - name: 'gcr.io/$PROJECT_ID/davidmuraya:$COMMIT_SHA'
    entrypoint: 'alembic'
    args: ['upgrade', 'head']
    secretEnv: ['POSTGRES_USER', 'POSTGRES_PASSWORD', 'POSTGRES_HOST', 'POSTGRES_PORT', 'POSTGRES_DATABASE_NAME']

  # Step 3: Push the Docker image
  - name: 'gcr.io/cloud-builders/docker'
    args: ['push', 'gcr.io/$PROJECT_ID/davidmuraya:$COMMIT_SHA']

  # Step 4: Deploy to Cloud Run
  - name: 'gcr.io/cloud-builders/gcloud'
    args:
      - 'run'
      - 'deploy'
      - 'davidmuraya'
      - '--image=gcr.io/$PROJECT_ID/davidmuraya:$COMMIT_SHA'
      - '--set-env-vars=ENVIRONMENT=PRODUCTION'
      - '--region=us-central1'
      - '--project=$PROJECT_ID'

  # Step 5: Update traffic to latest revision
  - name: 'gcr.io/cloud-builders/gcloud'
    args:
      - 'run'
      - 'services'
      - 'update-traffic'
      - 'davidmuraya'
      - '--to-latest'
      - '--region=us-central1'
      - '--project=$PROJECT_ID'

availableSecrets:
  secretManager:
    - versionName: projects/$PROJECT_ID/secrets/POSTGRES_USER/versions/latest
      env: 'POSTGRES_USER'
    - versionName: projects/$PROJECT_ID/secrets/POSTGRES_PASSWORD/versions/latest
      env: 'POSTGRES_PASSWORD'
    - versionName: projects/$PROJECT_ID/secrets/POSTGRES_HOST/versions/latest
      env: 'POSTGRES_HOST'
    - versionName: projects/$PROJECT_ID/secrets/POSTGRES_PORT/versions/latest
      env: 'POSTGRES_PORT'
    - versionName: projects/$PROJECT_ID/secrets/POSTGRES_DATABASE_NAME/versions/latest
      env: 'POSTGRES_DATABASE_NAME'

options:
  logging: CLOUD_LOGGING_ONLY

Note: Replace davidmuraya with your actual project name or service name as needed.

In Step 2, the pipeline runs Alembic migrations using the Docker image you just built. The command alembic upgrade head applies all pending migrations to your database. Because this step runs before the deploy step, the new revision only goes live against an up-to-date schema. Note that the Cloud Build worker needs network access to your database for this step to succeed.
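For this step to work, Alembic's env.py must build its database URL from those injected environment variables rather than from a hard-coded value in alembic.ini. A minimal sketch, assuming the POSTGRES_* names from the secrets above and a Postgres driver (the helper name and driver are illustrative):

```python
import os
from urllib.parse import quote_plus


def database_url() -> str:
    """Build a SQLAlchemy URL from the POSTGRES_* variables that
    Cloud Build injects from Secret Manager (see secretEnv above)."""
    user = os.environ["POSTGRES_USER"]
    # Escape special characters so the password is URL-safe
    password = quote_plus(os.environ["POSTGRES_PASSWORD"])
    host = os.environ["POSTGRES_HOST"]
    port = os.environ["POSTGRES_PORT"]
    name = os.environ["POSTGRES_DATABASE_NAME"]
    return f"postgresql+psycopg2://{user}:{password}@{host}:{port}/{name}"


# In alembic/env.py, override the ini setting before running migrations:
# config.set_main_option("sqlalchemy.url", database_url())
```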

Managing Secrets with Google Secret Manager

In a previous article, I explained how to use a .env file to manage your FastAPI configuration settings. While .env files are useful for local development, you should never commit them to version control. Sensitive information like database credentials must be kept out of your repository.

For production deployments, use Google Secret Manager to store secrets safely and inject them as environment variables during your build and deployment process. In the example above, secrets like POSTGRES_USER and POSTGRES_PASSWORD are pulled from Secret Manager and made available to Alembic. This keeps your secrets out of your codebase and version control, reducing the risk of accidental exposure.

For a step-by-step guide on securely managing environment variables in FastAPI with Google Secret Manager and Cloud Run, see Secure FastAPI Environment Variables on Cloud Run with Secret Manager.

To add a secret, use the Google Cloud Console or CLI. Then reference it in your cloudbuild.yaml under availableSecrets.

Best Practices

  • Always review migration files before applying them, especially if you use --autogenerate.
  • Run migrations before deploying your app to avoid schema mismatches.
  • Store secrets in Secret Manager, not in your code or Docker images.
  • Use descriptive messages for your migrations so you know what changed.

Conclusion

Alembic makes it easy to manage database schema changes. By automating migrations in your Google Cloud Build pipeline, you keep your database in sync with your code and avoid manual steps. Use Secret Manager for credentials, and always check your migrations before running them. This approach is simple, reliable, and works well for most projects.

About the Author

David Muraya is a Solutions Architect specializing in Python, FastAPI, and Cloud Infrastructure. He is passionate about building scalable, production-ready applications and sharing his knowledge with the developer community. You can connect with him on LinkedIn.

Related Blog Posts

Enjoyed this blog post? Check out these related posts!

How to Set Up Logging in FastAPI

A clear, practical guide to logging in FastAPI, covering log levels, configuration, and best practices for real-world applications.

Deploying Reflex Front-End with Caddy in Docker

A step-by-step guide to building and serving Reflex static front-end files using Caddy in a Docker container.

Reflex Makes SEO Easier: Automatic robots.txt and sitemap.xml Generation

Discover how adding your deploy URL in Reflex automatically generates robots.txt and sitemap.xml for easier SEO.

Contact Me

Have a project in mind? Send me an email at hello@davidmuraya.com and let's bring your ideas to life. I am always available for exciting discussions.

© 2025 David Muraya. All rights reserved.