How to deploy a folder with a Dockerfile to Cloud Run

I deployed by creating a folder on my computer containing a Dockerfile and then shipping that folder up to Google Cloud Run.

Normally I use datasette publish cloudrun to deploy to Cloud Run, but in this case I decided to do it by hand.

I created a folder and dropped two files into it: a Dockerfile and a metadata.json. But this trick would work with more files in the same directory too - gcloud uploads the entire directory contents to be built by Google Cloud Build.
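The setup can be sketched like this - note that the met-deploy folder name here is my own placeholder, not from the original deploy:

    # Create a directory holding just the files to ship to Cloud Build.
    # "met-deploy" is a placeholder name.
    mkdir -p met-deploy
    cd met-deploy
    touch Dockerfile metadata.json
    ls

Everything in this directory gets uploaded when you later run gcloud builds submit, so keep it to just the files the build needs.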


Here's the Dockerfile:

    FROM python:3.6-slim-stretch
    RUN apt update
    RUN apt install -y python3-dev gcc wget
    ADD metadata.json metadata.json
    RUN wget -q ""
    RUN pip install datasette
    RUN datasette inspect MetObjects.db --inspect-file inspect-data.json

    CMD datasette serve MetObjects.db --host 0.0.0.0 --cors --port $PORT --inspect-file inspect-data.json -m metadata.json

The PORT environment variable is provided by Cloud Run. It's currently 8080, but they may change that in the future, so it's best to read it from the environment rather than hard-coding it.
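When running the container locally, PORT won't be set, so it's handy to fall back to a default. A minimal sketch - the 8001 fallback is my own choice, not part of the original deploy:

    # Use Cloud Run's PORT if provided, otherwise fall back to a
    # local default (8001 here is an assumption for local testing)
    PORT="${PORT:-8001}"
    echo "serving on port $PORT"
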

Here's the metadata.json:

    "title": "The Metropolitan Museum of Art Open Access",
    "source": "metmuseum/openaccess",
    "source_url": "",
    "license": "CC0",
    "license_url": ""

Finally, here's the script I used to run the deploy. It needs to be run from inside that directory:

    # NAME is a placeholder here - pick your own unique service name
    NAME="met-museum"
    PROJECT=$(gcloud config get-value project)
    IMAGE="gcr.io/$PROJECT/$NAME"

    gcloud builds submit --tag $IMAGE
    gcloud run deploy --allow-unauthenticated --platform=managed --image $IMAGE $NAME --memory 2Gi

Before running the script I had installed the Google Cloud SDK (which provides the gcloud command) and run gcloud init to log in.

The NAME variable ends up being used as the name of both the built image and the service. It needs to be unique within your project, or your deploy will overwrite an existing service.

Cloud Run deployed my site to an automatically generated .run.app URL.

I then used the "Domain mappings" feature in the Cloud Run web console to point a better web address at it.

Created 2020-08-04T20:36:31-07:00, updated 2020-12-29T13:55:23-08:00