Data backup recommendations

Each time you perform an upgrade, our upgrade script takes a backup of the Postgres database in your environment. However, we also recommend taking a daily backup of the underlying storage that your Tines deployment runs on, and storing that backup on a separate system.

Step 1: Copy the data to a backup folder 

The first step is to copy the data to a backup folder, which will be transferred to a separate system in the next step. The exact instructions vary depending on how your environment is configured, but here are some examples.

Example 1: using pg_dump 

If you have the PostgreSQL client utility pg_dump installed, you can generate a dump file by following the PostgreSQL pg_dump documentation.
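For example, a command along these lines would write a compressed dump file into a backup folder. The database name, user, host, and output path here are placeholders, so adjust them to match your environment:

pg_dump -h localhost -U tines -F c -f ./db-backup/2025-01-01/tines.dump tines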

Example 2: copying the data directory 

Make a backup directory if it doesn't exist, e.g., mkdir db-backup.

Copy the data to this new directory, e.g., cp -r /var/lib/docker/volumes/tines_db-data/_data/ ./db-backup/2025-01-01/. You may need to adjust the path if your Tines database volume has a different name.
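As a minimal sketch, assuming the default volume path shown above, the two commands can be combined so the backup folder is stamped with the current date:

mkdir -p ./db-backup/$(date +%F)
cp -r /var/lib/docker/volumes/tines_db-data/_data/ ./db-backup/$(date +%F)/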

Step 2: Transfer the file to a separate system 

You can now transfer the backup folder to a separate system. Below is an example of how you could do this.

Example: using scp 

You can use secure copy to transfer the folder to a remote server. E.g., scp -r ./db-backup/2025-01-01/ backup@remote-server:/backup/2025-01-01.

Step 3: Automating this backup 

The first two steps create a backup manually. To take the recommended daily backup without manual intervention, you can automate the process. Again, this depends on your platform, but one way to implement it on Linux is a cron job that runs a bash script containing the commands above.
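As a rough sketch, such a script might look like the following. The paths, volume name, and remote host are assumptions carried over from the examples above, and it assumes key-based SSH access to the backup server is already set up; adjust everything to match your environment.

#!/bin/bash
# Rough sketch of a daily Tines database backup script.
# Paths, volume name, and remote host below are assumptions; adjust as needed.
set -euo pipefail

# Date-stamped backup folder, e.g. /opt/tines/db-backup/2025-01-01
BACKUP_DIR="/opt/tines/db-backup/$(date +%F)"

# Step 1: copy the database data to the backup folder
mkdir -p "$BACKUP_DIR"
cp -r /var/lib/docker/volumes/tines_db-data/_data/ "$BACKUP_DIR/"

# Step 2: transfer the folder to the separate backup system
scp -r "$BACKUP_DIR" backup@remote-server:/backup/

You could then schedule the script to run daily with a crontab entry such as the following, which runs it every day at 02:00 (the script path is hypothetical):

0 2 * * * /opt/tines/backup.sh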
