
Backup Sanity to AWS S3

Headless CMSes are refreshingly "set and forget". But if the Risk Management Department needs to be satiated with a backup solution of your own, it's easy to do.


2020-12-11

Sanity's CLI will let you make backups of a project, but only locally and on demand.

I needed an app to remotely and programmatically generate backups, and S3 is as good a place as any to park them.

And since Sanity's CLI export is a package on its own, we can write a simple app to leverage it.
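To give a feel for it, the export step looks roughly like this. This is a minimal sketch rather than the exact code in the repo, and the project ID, dataset, output path and token variable are placeholders:

const sanityClient = require('@sanity/client')
const exportDataset = require('@sanity/export')

// A client scoped to the project and dataset you want to back up
const client = sanityClient({
  projectId: 'asdf123',
  dataset: 'production',
  token: process.env.SANITY_READ_TOKEN, // an API token with read access
  useCdn: false,
})

// Stream the whole dataset into a local tar.gz archive
exportDataset({
  client,
  dataset: 'production',
  outputPath: './backups/production.tar.gz',
  assets: true, // include images and files
  drafts: true, // include draft documents
}).then(() => console.log('Export complete'))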

(Sanity.io is super reliable and backed by Google Cloud Platform ... but it's a good idea to have your own regular, automated backups!)

Check it out:

Sanity Backup to S3 on GitHub

What I have written here is a simple Express application which takes a number of parameters as a path, like this:

http://localhost:5501/SANITY_PROJECT_ID/DATASET/BUCKET_NAME/EMAIL

You can clone the repo and run it locally to generate backups, or deploy it to Heroku (or similar) to trigger them remotely (for example, from a cron job).
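The route itself is little more than this sketch (the names are illustrative; the real app wires the export, upload and email steps into the handler):

const express = require('express')
const app = express()

// The four path segments become parameters; the email is optional
app.get('/:projectId/:dataset/:bucket/:email?', async (req, res) => {
  const { projectId, dataset, bucket, email } = req.params

  // 1. Export the dataset with @sanity/export
  // 2. Upload the archive to the named S3 bucket
  // 3. Optionally send a download link to the email address via Mailgun
  res.send(`Backing up ${projectId}/${dataset} to ${bucket}`)
})

app.listen(5501, () => console.log('Listening on http://localhost:5501'))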

Get set up

From Sanity

You can find or create these details on the Sanity Manage page:

  1. A Sanity Project ID
  2. The name of the dataset you'll be backing up
  3. An API Token on the project with 'read' access

The API Token is stored in .env under a unique key: the Project ID in ALL CAPS followed by _TOKEN. This way you can back up multiple projects using the one app, so long as a matching API Token exists in .env

Example: If your Project ID is asdf123, your API Token should be stored as ASDF123_TOKEN='...'
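In the app that lookup is just a computed key on process.env, something like:

// projectId comes from the request path, e.g. 'asdf123' resolves to ASDF123_TOKEN
const token = process.env[`${projectId.toUpperCase()}_TOKEN`]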

From AWS S3

Create a new Bucket in the S3 Console and find the Access Key and Secret Access Key of an IAM User that has permissions to upload files to it.

So now you'll have:

  1. ACCESS_KEY and SECRET_ACCESS_KEY in .env
  2. The name of the destination bucket
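The upload step then looks something like this sketch, using the aws-sdk package. The bucket name, key and file path here are placeholders:

const AWS = require('aws-sdk')
const fs = require('fs')

const s3 = new AWS.S3({
  accessKeyId: process.env.ACCESS_KEY,
  secretAccessKey: process.env.SECRET_ACCESS_KEY,
})

s3.upload(
  {
    Bucket: 'sanity-backups',
    Key: 'asdf123-production.tar.gz',
    Body: fs.createReadStream('./backups/production.tar.gz'),
    ACL: 'private', // or 'public-read' when an email link will be sent (see below)
  },
  (err, data) => {
    if (err) throw err
    console.log(`Uploaded to ${data.Location}`) // data.Location is the file's URL
  }
)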

From Mailgun

As an extra feature I have added some basic Mailgun support to the app to deliver a notification along with a download link for the backup.

This last parameter in the path (the email address) is optional, so if it's enough to just upload the backup to S3 you can skip this step.

After setting up your Mailgun account, add the following to your .env:

  1. MAILGUN_API
  2. MAILGUN_DOMAIN
  3. MAILGUN_HOST (possibly optional; required to deliver from Mailgun's EU servers)

Note: If you do include an email, the backup will be uploaded to S3 with 'public-read' permissions. Otherwise it will be set to 'private'.
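The notification step is only a few lines with the mailgun-js package. The from address, subject and variables in this sketch are illustrative:

const mailgun = require('mailgun-js')({
  apiKey: process.env.MAILGUN_API,
  domain: process.env.MAILGUN_DOMAIN,
  host: process.env.MAILGUN_HOST, // e.g. 'api.eu.mailgun.net' for EU accounts
})

mailgun.messages().send({
  from: `Backups <postmaster@${process.env.MAILGUN_DOMAIN}>`,
  to: email, // the optional last path parameter
  subject: 'Sanity backup complete',
  text: `Download your backup: ${uploadUrl}`, // the S3 URL from the upload step
})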

Get going

Your .env file should now look like the .env.example file ... but with actual values in it!

ACCESS_KEY=''
SECRET_ACCESS_KEY=''
ASDF123_TOKEN=''
MAILGUN_API=''
MAILGUN_DOMAIN=''
MAILGUN_HOST=''

Run npm install in the directory of the app and then npm start to get started. You'll be prompted to visit a URL.

For example, in our Sanity Project asdf123, to back up the production dataset to our sanity-backups S3 bucket and email [email protected], you would visit:

http://localhost:5501/asdf123/production/sanity-backups/[email protected]

Go live

I have tested this on small-ish datasets on a free Heroku dyno and it is reasonably reliable. Your mileage may vary though if you're trying to regularly export multiple or really large datasets.
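For example, once the app is deployed you could schedule a nightly backup with a crontab entry along these lines (the app URL here is a placeholder):

# Hit the backup endpoint every night at 03:00
0 3 * * * curl -s https://your-backup-app.herokuapp.com/asdf123/production/sanity-backups/[email protected]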

It's really basic and has plenty of scope for improvement. But it does the job.

Did you find this App helpful? Let me know!