• Docker, Postgres

    🐘 Docker Postgres Autoupgrades

    Upgrading Postgres in Docker environments can be daunting, but keeping your database up-to-date is essential for performance, security, and access to new features. While there are numerous guides on manually upgrading Postgres, the process can often be complex and error-prone. Fortunately, the pgautoupgrade Docker image simplifies this process, automating the upgrade dance for us.

    The Challenge of Upgrading Postgres

    For many developers, upgrading Postgres involves several manual steps: backing up data, migrating schemas, ensuring compatibility, and testing thoroughly. Mistakes during these steps can lead to downtime or data loss, making the upgrade process a nerve-wracking experience.

    The pgautoupgrade Docker image is designed to handle the upgrade process seamlessly. Using it in place of the base Postgres image allows you to automate the upgrade steps, reducing the risk of errors and saving valuable time.

    How to Use pgautoupgrade

    While you can use pgautoupgrade directly with Docker, I prefer to use it as my default development image.
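    For direct Docker usage, a one-off upgrade run against an existing data volume might look something like this (the volume name and password here are placeholders, not from my setup):

    ```shell
    # Run pgautoupgrade once against an existing Postgres data volume.
    # "postgres_data" is a placeholder volume name; adjust to match yours.
    docker run --rm \
        -v postgres_data:/var/lib/postgresql/data \
        -e POSTGRES_PASSWORD=postgres \
        pgautoupgrade/pgautoupgrade:latest
    ```

    The image detects the data directory's Postgres version on startup and runs the upgrade before serving connections, so pointing it at an old volume is all it takes.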

    My compose.yml config with pgautoupgrade looks similar to this:

    # compose.yml
    services:
      db:
        image: "pgautoupgrade/pgautoupgrade:latest"
        volumes:
          - postgres_data:/var/lib/postgresql/data/
    # ...
    

    Instead of using the latest version of Postgres, pgautoupgrade can be set to a specific version. This is nice if you want to match whichever version of Postgres you use in production or if you have extensions that might not be ready to move.

    # compose.yml
    services:
      db:
        image: "pgautoupgrade/pgautoupgrade:16-alpine"
        volumes:
          - postgres_data:/var/lib/postgresql/data/
    # ...
    

    Overall, I’m happy with pgautoupgrade. Please note that using pgautoupgrade does not mean you should skip making data backups.

    See my last article, 🐘 A Just recipe to backup and restore a Postgres database, to learn some tips on how to automate using pg_dump and pg_restore.

    Saturday June 29, 2024
  • Justfiles, Docker, Postgres

    🐘 A Just recipe to backup and restore a Postgres database

    I have used this casey/just recipe to help backup and restore my Postgres databases from my Docker containers.

    I work with a few machines, and it’s an excellent way to create a database dump from one machine and then restore it from another machine. I sometimes use it to test data migrations because restoring a database dump takes a few seconds.

    I have been migrating from Docker to OrbStack, and the only real pain point is moving data from one volume to another. I sometimes need to switch between the two, so I have recipes set to back up and restore my database from one context to another.

    # justfile
    
    DATABASE_URL := env_var_or_default('DATABASE_URL', 'postgres://postgres@db/postgres')
    
    # dump database to file
    @pg_dump file='db.dump':
        docker compose run \
            --no-deps \
            --rm \
            db \
            pg_dump \
                --dbname "{{ DATABASE_URL }}" \
                --file /code/{{ file }} \
                --format=c \
                --verbose
    
    # restore database dump from file
    @pg_restore file='db.dump':
        docker compose run \
            --no-deps \
            --rm \
            db \
            pg_restore \
                --clean \
                --dbname "{{ DATABASE_URL }}" \
                --if-exists \
                --no-owner \
                --verbose \
                /code/{{ file }}
    

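    With these recipes in place, moving a database between Docker and OrbStack becomes a short round trip. A sketch of what that looks like (the context name is whatever you have configured locally):

    ```shell
    # Dump the database from the current Docker context...
    just pg_dump backup.dump

    # ...switch contexts (e.g. to OrbStack), then restore the dump there.
    docker context use orbstack
    just pg_restore backup.dump
    ```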
    Shoutout to Josh Thomas for help on this recipe since we both iterated on this for several projects.

    Friday June 28, 2024