How To Set Up Automated Backups For Umami

Published: 18.09.2022

Are you interested in setting up automated backups for Umami? In this post, I will show you why and how to do it! The setup is based on the self-hosted Docker version that I explained in an earlier post.


The foundation for this post is the Umami setup in Docker from this post here. In addition to the containers there, we need to create a companion container for the PostgreSQL database that takes care of the backups.

But why should you back up your statistics data? First of all, because data loss can happen at any time, whether by accident or because someone deletes your data on purpose. With the help of backups, you can recover from such a data loss, usually restoring a state from just a few days back in time.

In the next section, we will set up the companion container.

Automated Backups for Umami

Before we start setting up the databases and the backup of the data, we will create the container configurations that are needed for Umami. To do so, we create a docker-compose.yml file in a directory of our choice. Inside the file, we configure the umami container:

services:
    umami:
        container_name: umami
        # official Umami image (PostgreSQL build); adjust the tag to the version you use
        image: ghcr.io/umami-software/umami:postgresql-latest
        ports:
            - "3000:3000"
        environment:
            DATABASE_URL: postgresql://umami:umami123@umami-db:5432/umami
            DATABASE_TYPE: postgresql
            HASH_SALT: <random-string>
        depends_on:
            - umami-db
        restart: always
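The HASH_SALT should be a long random string. Assuming the openssl CLI is available on your machine, one way to generate it:

```shell
# generate 32 random bytes, base64-encoded, to use as HASH_SALT
openssl rand -base64 32
```

Paste the output into the compose file in place of <random-string>.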

After that, we will first create the PostgreSQL database that stores all the configuration and the statistics data that you collect. For that, we create the umami-db container and the pg-backup container, which is based on the postgres-backup-local image and allows for automated and scheduled backups.


    umami-db:
        container_name: umami-db
        image: postgres:12-alpine
        environment:
            POSTGRES_DB: umami
            POSTGRES_USER: umami
            POSTGRES_PASSWORD: umami123
        volumes:
            - ./umami/data:/var/lib/postgresql/data
            - ./umami/backup:/var/lib/postgresql/backup/
        restart: always

    pg-backup:
        container_name: pg-backup
        image: prodrigestivill/postgres-backup-local
        restart: always
        volumes:
            - ./umami/backup/:/backups/
        links:
            - umami-db:umami-db
        depends_on:
            - umami-db
        environment:
            - POSTGRES_HOST=umami-db
            - POSTGRES_DB=umami
            - POSTGRES_USER=umami
            - POSTGRES_PASSWORD=umami123
            - POSTGRES_EXTRA_OPTS=-Z9 --schema=public --blobs -a
            - SCHEDULE=@daily
            - BACKUP_KEEP_DAYS=14
            - BACKUP_KEEP_WEEKS=4
            - BACKUP_KEEP_MONTHS=6
            - HEALTHCHECK_PORT=81

With the environment variables of the pg-backup container, you can configure the automated backups easily. For example, with the current configuration, a backup is created once a day and kept for 14 days. In addition to the daily backups, the image also automatically generates weekly and monthly backups. A full list of the available variables can be found here.
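A quick note on SCHEDULE: according to the postgres-backup-local documentation, the underlying go-cron scheduler also supports other shortcuts besides @daily. For example, switching to weekly backups would look like this:

```yaml
        environment:
            # run the backup once a week instead of once a day
            - SCHEDULE=@weekly
```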

The backups can be found in ./umami/backup.
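A backup job that silently stops working is easy to miss, so it can be worth verifying from time to time that a fresh dump actually exists. A minimal sketch of such a check (the directory path and the two-day threshold in the usage example are my assumptions, adjust them to your setup):

```shell
# check_backup: succeed if the given directory contains a .sql.gz
# file that was modified within the last N days
check_backup() {
    backup_dir="$1"
    max_age_days="$2"
    # -mtime -N matches files modified less than N*24h ago;
    # grep -q . turns "any match found?" into the function's exit code
    find "$backup_dir" -name '*.sql.gz' -mtime -"$max_age_days" 2>/dev/null | grep -q .
}

# example usage:
# check_backup ./umami/backup 2 || echo "WARNING: no recent Umami backup!"
```

You could run this from cron and send yourself a notification when it fails.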



With this, we have set up automated backups for Umami. Now let’s have a look at how to recover the data in case of a data loss!

Recover the Data from Backups

In this section, we will have a look at how to recover the data of the PostgreSQL database:

  1. docker stop umami-db
  2. docker rm umami-db
  3. rename the old data folder (e.g. mv ./umami/data ./umami/data.old)
  4. docker compose up -d umami-db
  5. docker restart umami
  6. check in the browser if everything works as expected (the database is empty at this point, so no websites should be listed yet)
  7. docker exec -it umami-db bash -c "zcat /var/lib/postgresql/backup/<backup-dir>/<backup-file>.sql.gz | psql --username=umami --dbname=umami -W"

With this, you can recover data after a data loss!


In this post, we set up automated backups for Umami with a companion container. We learned how to configure the backup creation and how to recover the data in case of a data loss!

I hope this post helped you set up automated backups and that it keeps you safe from the struggle I had when I lost my data.

In case you liked this post consider subscribing to my newsletter to get monthly updates on all of my posts!
