I found on the Airbyte forum how to migrate an Airbyte instance to a new one, and I also used that resource to create a backup script for Airbyte.
For a migration from one place to another
Creating a backup of the old instance
- Stop all the services
docker-compose down
- Start only the database
docker-compose up -d db
- Create the backup database file
docker exec airbyte-db pg_dump -U docker airbyte > airbyte_backup.sql
- Stop the service again
docker-compose down
Once you’ve done that, you will have generated an airbyte_backup.sql file.
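Before moving the dump anywhere, it is worth a quick sanity check. Here is a minimal sketch; `verify_dump` is a hypothetical helper (not part of Airbyte or Postgres), and it assumes the plain-format output that pg_dump produces by default:

```shell
# Hypothetical helper: sanity-check a plain-format pg_dump file.
verify_dump() {
  local dump_file="$1"
  # The dump should be non-empty and contain at least one table definition.
  [ -s "$dump_file" ] && grep -q "CREATE TABLE" "$dump_file"
}

# Example: verify_dump airbyte_backup.sql || echo "dump looks incomplete"
```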
Loading the backup on the new instance
Go to your new server and start Airbyte using docker-compose up -d. Once the service is running, stop it again with docker-compose down so we can rebuild the database.
- Start only the database
docker-compose up -d db
- Drop the database created by Airbyte by default
docker exec airbyte-db psql -U docker -c 'drop database airbyte;'
- Create the database to load the backup
docker exec airbyte-db psql -U docker -c 'create database airbyte with owner docker;'
- Then restore the airbyte_backup.sql file into the airbyte database.
cat airbyte_backup.sql | docker exec -i airbyte-db psql -U docker -d airbyte
- Then start all the services again
docker-compose up -d
Please make sure to have put the backup file in the folder of your new Airbyte instance.
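The restore steps above can be sketched as a single script. This is a minimal sketch, assuming the same container name (airbyte-db) and user (docker) as above, and that airbyte_backup.sql sits in the folder you run it from; the function is only defined here, you run it yourself:

```shell
#!/bin/bash
set -eu

# Sketch of the restore steps on the new instance.
restore_airbyte_db() {
  docker-compose down       # stop all services
  docker-compose up -d db   # start only the database
  # Recreate an empty airbyte database
  docker exec airbyte-db psql -U docker -c 'drop database airbyte;'
  docker exec airbyte-db psql -U docker -c 'create database airbyte with owner docker;'
  # Load the backup into it
  cat airbyte_backup.sql | docker exec -i airbyte-db psql -U docker -d airbyte
  docker-compose up -d      # start all services again
}
```

Source the file (or paste the function into a shell) and call `restore_airbyte_db` from your new Airbyte folder.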
Backup script on Google Cloud Storage
I created this bash script in order to generate an airbyte_backup.sql file and upload it to a GCS bucket.
#!/bin/bash
# Get the current date
current_date=$(date +%F)
bucket_name="your_bucket_name"
# Stop all services
echo "Stopping all services..."
docker compose down
# Start only the database
echo "Starting only the database..."
docker compose up -d db
# Create the backup database file
echo "Creating the backup database file..."
docker exec airbyte-db pg_dump -U docker airbyte > airbyte_backup.sql
# Stop the service again
echo "Stopping all services again..."
docker compose down
# Upload the generated .sql to Google Cloud Storage (GCS)
echo "Uploading the generated .sql to Google Cloud Storage (GCS)..."
gsutil cp airbyte_backup.sql gs://$bucket_name/airbyte_db/$current_date/airbyte_backup.sql
# Remove the generated .sql file
echo "Removing the generated .sql file..."
rm airbyte_backup.sql
# Restart Airbyte
./run-ab-platform.sh
echo "Process completed."
It’ll drop the file into your bucket, inside the airbyte_db folder, in a subfolder named with the date of the backup. Please make sure to replace your_bucket_name with your GCS bucket name.
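If other tooling (for example a restore job) needs to locate a given day's backup, it can compute the object path the same way the script builds it. `backup_path` is a hypothetical helper, not part of the script above:

```shell
# Hypothetical helper: compute the GCS object path the backup script writes to.
backup_path() {
  local bucket="$1" backup_date="$2"
  echo "gs://${bucket}/airbyte_db/${backup_date}/airbyte_backup.sql"
}

# Example:
# backup_path your_bucket_name 2024-01-31
```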
To run this script automatically, do the following:
- Open the crontab by running
crontab -e
- Add this line into the file
0 12 * * * ~/airbyte/backup_script.sh
The 0 12 * * * part is the cron schedule I set; in my situation it runs every day at 12pm (noon) and executes the script at ~/airbyte/backup_script.sh.
Make sure to put backup_script.sh in the same folder as your Airbyte instance.
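Since cron runs the job non-interactively, it helps to keep a log of each run so failures don't go unnoticed. A variant of the cron line above that appends the script's output to a log file (the log path here is just an example):

```
0 12 * * * ~/airbyte/backup_script.sh >> ~/airbyte/backup_script.log 2>&1
```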