Running Datasette on DigitalOcean App Platform

App Platform is the new PaaS from DigitalOcean. I figured out how to run Datasette on it.

The bare minimum needed is a GitHub repository with two files: requirements.txt and Procfile.

requirements.txt can contain a single line:

datasette

Procfile needs this:

web: datasette . -h 0.0.0.0 -p $PORT --cors

Your web process needs to listen on 0.0.0.0 and on the port in the $PORT environment variable.

Connect this GitHub repository to DigitalOcean App Platform and it will deploy the application - detecting that it's a Python application (from the requirements.txt file), installing those requirements and then starting the process defined in the Procfile.

Any SQLite .db files that you add to the root of the GitHub repository will be automatically served by Datasette when it starts up.
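As a minimal sketch of what such a file might contain, the database could be built locally with Python's standard sqlite3 module and then committed - the demo.db filename, table and row below are placeholder examples, not part of the original setup:

```python
import sqlite3

# Build a small example database to commit to the repository root
# (demo.db and the quotes table are placeholder names)
conn = sqlite3.connect("demo.db")
conn.execute(
    "CREATE TABLE IF NOT EXISTS quotes (id INTEGER PRIMARY KEY, text TEXT)"
)
conn.execute(
    "INSERT INTO quotes (text) VALUES (?)", ("Hello from App Platform",)
)
conn.commit()
conn.close()
```

Committing demo.db alongside requirements.txt and the Procfile would make it available at /demo once the app deploys.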

Because Datasette is run using datasette . it will also automatically pick up a metadata.json file, custom templates in a templates/ folder and plugins in a plugins/ folder, as described in Configuration directory mode in the documentation.
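For example, a minimal metadata.json in the root of the repository might look like this - the title and description here are placeholders, not part of the original setup:

```json
{
  "title": "My Datasette instance",
  "description": "Deployed on DigitalOcean App Platform"
}
```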

Building database files

I don't particularly like putting binary SQLite files in a GitHub repository - I prefer to store CSV files or SQL text files and build them into a database file as part of the deployment process.

The best way I've found to do this in a DigitalOcean App is to create a build.sh script that builds the database, then execute it using a Build Command.

One way to do this is to visit the "Components" tab and click "Edit" in the Commands section, then set the "Build Command" to . build.sh. Any code you add to a build.sh script in your repo will then be executed as part of the deployment.

A better way (thanks, Kamal Nasser) is to use a bin/pre_compile or bin/post_compile script in your repository.
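As a hedged sketch of the CSV-to-database idea, a build script could call a short Python program that uses only the standard library - the data.csv and data.db filenames and the column names below are hypothetical:

```python
import csv
import sqlite3

# Hypothetical build step: write a small data.csv, then load it into
# data.db (filenames and columns are placeholder examples)
with open("data.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["name", "value"])
    writer.writerow(["colour", "blue"])

conn = sqlite3.connect("data.db")
conn.execute("CREATE TABLE IF NOT EXISTS rows (name TEXT, value TEXT)")
with open("data.csv", newline="") as f:
    # DictReader yields dicts, which map onto the named placeholders
    conn.executemany(
        "INSERT INTO rows (name, value) VALUES (:name, :value)",
        csv.DictReader(f),
    )
conn.commit()
conn.close()
```

In a real deployment the CSV would already live in the repository, and the script would be invoked from build.sh or bin/post_compile so that data.db exists before Datasette starts.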

I started with a build.sh script that looked like this:

wget https://latest.datasette.io/fixtures.db

And this resulted in the fixtures.db database file being served at /fixtures under my app's subdomain.

Created 2020-10-06T19:45:25-07:00, updated 2020-10-07T07:29:46-07:00