I really like Cog (previously) as a tool for automating aspects of my Python project documentation - things like the SQL schemas shown on the LLM logging page. …
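One way to keep Cog output honest in CI is to re-run it and fail the build if anything changed - a minimal sketch, assuming the cogged files live under docs/:

```yaml
# Regenerate Cog blocks and fail if the committed docs were stale
- name: Check docs are up to date
  run: |
    pip install cogapp
    cog -r docs/*.md
    git diff --exit-code
```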
Newly created GitHub repositories come with a default set of labels. I have several labels I like to add on top of these. The most important is research, which I use for issues that are tracking my notes on a research topic relevant to the repository. …
I'm trying a new thing: a private daily planner, where each day I note down my goals for the day and make notes on my progress towards them as the day progresses. …
I figured out how to serve a JavaScript project built using Vite using GitHub Pages and a custom build script that runs using GitHub Actions. …
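The general shape of a Pages deploy for a Vite build looks something like this - the branch, paths and action versions here are assumptions, not the exact workflow:

```yaml
name: Deploy to GitHub Pages
on:
  push:
    branches: [main]
permissions:
  contents: read
  pages: write
  id-token: write
jobs:
  deploy:
    runs-on: ubuntu-latest
    environment:
      name: github-pages
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
      - run: npm ci && npm run build   # Vite writes its output to dist/ by default
      - uses: actions/upload-pages-artifact@v3
        with:
          path: dist
      - uses: actions/deploy-pages@v4
```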
My datasette-export-notebook plugin worked fine in the stable release of Datasette, currently version 0.64.3, but failed in the Datasette 1.0 alphas. Here's the issue describing the problem. …
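One way to catch this kind of breakage early is to run the plugin's tests against both stable Datasette and the latest pre-release - a sketch, where the matrix naming is my own placeholder:

```yaml
strategy:
  matrix:
    datasette-pre: ["", "--pre"]   # "" = latest stable, "--pre" = latest alpha/beta
steps:
  - uses: actions/checkout@v4
  - uses: actions/setup-python@v5
    with:
      python-version: "3.11"
  - run: |
      pip install -e '.[test]'
      pip install --upgrade ${{ matrix.datasette-pre }} datasette
      pytest
```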
I used to use a combination of actions/setup-python and actions/cache in all of my Python GitHub Actions projects in order to install Python dependencies via a cache, rather than hitting PyPI to download copies every time. …
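These days actions/setup-python has pip caching built in, which is presumably the simpler replacement - a minimal example:

```yaml
- uses: actions/setup-python@v5
  with:
    python-version: "3.12"
    cache: pip   # keys the cache off the requirements files in the repo
- run: pip install -r requirements.txt
```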
I wanted to ensure that when this template repository was used to create a new repo, that repo would have a specific set of labels. …
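I haven't reproduced the full mechanism here, but a one-shot workflow step that applies labels with the gh CLI gives the rough idea - the label names and colors are placeholders:

```yaml
- name: Create standard labels
  env:
    GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
    GH_REPO: ${{ github.repository }}
  run: |
    # || true means re-running the step is harmless if a label already exists
    gh label create research --color c2e0c6 --description "Research notes" || true
    gh label create demo --color fbca04 --description "Tracks the live demo" || true
```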
My simonwillisonblog-backup workflow periodically creates a JSON backup of my blog's PostgreSQL database, using db-to-sqlite and sqlite-diffable. It then commits any changes back to the repo using this pattern: …
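The commit step in that pattern looks roughly like this (the committer identity and message are placeholders):

```yaml
- name: Commit and push if anything changed
  run: |-
    git config user.name "Automated"
    git config user.email "actions@users.noreply.github.com"
    git add -A
    timestamp=$(date -u)
    git commit -m "Latest data: ${timestamp}" || exit 0   # exit 0 = nothing changed
    git push
```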
My datasette-screenshots repository generates screenshots of Datasette using my shot-scraper tool, for people who need them for articles or similar. …
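The core of it is just the shot-scraper CLI running inside a workflow step - a sketch, with an example URL:

```yaml
- name: Take a screenshot
  run: |
    pip install shot-scraper
    shot-scraper install    # installs the headless browser that Playwright needs
    shot-scraper https://latest.datasette.io/ -o datasette.png
```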
New feature announced here. Here's the full documentation. …
I've implemented this pattern a bunch of times now - here's the version I've settled on for my datasette-auth0 plugin repository. …
Some of my repositories have GitHub Actions workflows that execute commands using npx - for example, my graphql-scraper repo uses npx to install and run the get-graphql-schema tool: …
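A sketch of that kind of step - the GraphQL endpoint here is a placeholder:

```yaml
- uses: actions/setup-node@v4
  with:
    node-version: 20
- name: Fetch the GraphQL schema
  run: npx get-graphql-schema https://example.com/graphql > schema.graphql
```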
I decided to run my CI tests against the Python 3.11 preview, to avoid the problem I had when Python 3.10 came out with a bug that affected Datasette. …
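Recent releases of actions/setup-python accept a -dev suffix for preview builds, so a matrix like this (versions are illustrative) covers the upcoming release:

```yaml
strategy:
  matrix:
    python-version: ["3.9", "3.10", "3.11-dev"]
steps:
  - uses: actions/checkout@v4
  - uses: actions/setup-python@v5
    with:
      python-version: ${{ matrix.python-version }}
  - run: |
      pip install -e '.[test]'
      pytest
```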
For my git-history live demos I needed to store quite large files (~200MB SQLite databases) in between GitHub Actions runs, to avoid having to recreate the entire file from scratch every time. …
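One way to do that is the actions/cache action with a key that changes every run plus a restore-keys prefix that falls back to the most recent save - a sketch with placeholder paths:

```yaml
- name: Restore the previous database if one was cached
  uses: actions/cache@v4
  with:
    path: git-history.db
    key: git-history-db-${{ github.run_id }}   # unique key, so each run saves a fresh copy
    restore-keys: |
      git-history-db-
```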
For Datasette Desktop I wanted to run an action which, when I created a release, would build an asset for that release and then upload and attach it. …
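The general shape is a workflow triggered by the release event that builds the asset and then uploads it with gh release upload - the build script and asset path below are placeholders:

```yaml
name: Attach asset to release
on:
  release:
    types: [created]
permissions:
  contents: write   # needed to upload release assets with GITHUB_TOKEN
jobs:
  build-and-attach:
    runs-on: macos-latest
    steps:
      - uses: actions/checkout@v4
      - run: ./build-asset.sh   # placeholder for the real build step
      - name: Upload the built asset to the release
        env:
          GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        run: gh release upload ${{ github.event.release.tag_name }} dist/Datasette.app.zip
```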
The GitHub Actions ubuntu-latest default runner currently includes an installation of PostgreSQL 13. The server is not running by default but you can interact with it like this: …
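Roughly like this - the exact service name and commands may vary between runner image versions:

```yaml
- name: Start and use the pre-installed PostgreSQL server
  run: |
    sudo systemctl start postgresql.service
    pg_isready
    sudo -u postgres psql -c "CREATE DATABASE demo;"
    sudo -u postgres psql demo -c "SELECT version();"
```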
I wanted to run some Django tests - using pytest-django and with Django configured to pick up the DATABASE_URL environment variable via dj-database-url - against a PostgreSQL server running in GitHub Actions. …
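A common shape for this uses a postgres service container and points DATABASE_URL at it - the credentials and versions here are placeholders:

```yaml
jobs:
  test:
    runs-on: ubuntu-latest
    services:
      postgres:
        image: postgres:15
        env:
          POSTGRES_PASSWORD: postgres
        ports:
          - 5432:5432
        options: >-
          --health-cmd pg_isready
          --health-interval 10s
          --health-timeout 5s
          --health-retries 5
    env:
      DATABASE_URL: postgres://postgres:postgres@localhost:5432/postgres
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - run: pip install -e '.' pytest-django dj-database-url
      - run: pytest
```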
I decided to adopt Prettier as the JavaScript code style for Datasette, based on my success with Black for Python code. …
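Enforcing that in CI is a one-liner with prettier --check - the file paths are an assumption:

```yaml
- uses: actions/setup-node@v4
- name: Check JavaScript code style
  run: npx prettier --check 'datasette/static/*.js'
```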
I have a Django application which uses PostgreSQL. I build the Django application into its own Docker container, push that built container to the GitHub package registry and then deploy that container to production. …
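A sketch of the build-and-push part against the current GitHub Container Registry (ghcr.io) - the image tag is a placeholder and the job needs packages: write permission:

```yaml
- uses: actions/checkout@v4
- name: Log in to the GitHub Container Registry
  uses: docker/login-action@v3
  with:
    registry: ghcr.io
    username: ${{ github.actor }}
    password: ${{ secrets.GITHUB_TOKEN }}
- name: Build and push the image
  uses: docker/build-push-action@v5
  with:
    context: .
    push: true
    tags: ghcr.io/owner/my-django-app:latest   # placeholder - image names must be lowercase
```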
I wanted to have a GitHub Action step run that might fail, but if it failed the rest of the steps should still execute and the overall run should be treated as a success. …
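The key pieces are continue-on-error on the step plus, if later steps care, checking its outcome:

```yaml
- name: Step that is allowed to fail
  id: flaky
  continue-on-error: true
  run: ./might-fail.sh   # placeholder command
- name: Runs whether or not the previous step succeeded
  run: echo "Previous step outcome was ${{ steps.flaky.outcome }}"
```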
GitHub Actions workflows fail if any of the steps executes something that returns a non-zero exit code. …
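If only one command inside a step is allowed to fail, appending || true to it keeps the step's exit code at zero - a sketch with a placeholder command:

```yaml
- name: Ignore a failing command inside a step
  run: |
    ./optional-check.sh || true   # placeholder - its exit code is discarded
    echo "This line still runs and the step reports success"
```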
markdown-toc is a Node script that parses a Markdown file and generates a table of contents for it, based on the headings. …
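Running it in a workflow is a single npx call; the -i flag rewrites the file in place between its <!-- toc --> markers:

```yaml
- name: Update the table of contents in the README
  run: npx markdown-toc -i README.md
```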
Say you have a workflow that runs hourly, but once a day you want the workflow to run slightly differently - without duplicating the entire workflow. …
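The trick is to register more than one cron schedule and branch on github.event.schedule - a sketch with made-up times and script names:

```yaml
on:
  schedule:
    - cron: "0 0-4,6-23 * * *"   # every hour except 5am UTC
    - cron: "0 5 * * *"          # the once-a-day run
jobs:
  run:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: ./hourly-task.sh
      - name: Extra work for the daily run only
        if: github.event.schedule == '0 5 * * *'
        run: ./daily-task.sh
```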
This recipe runs a Python script to update a README, then commits it back to the parent repo but only if it has changed: …
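A variant of the commit step that uses git diff to decide whether there is anything to commit (the script name is a placeholder):

```yaml
- name: Update the README
  run: python update_readme.py   # placeholder for the real script
- name: Commit the README only if it changed
  run: |
    if ! git diff --quiet README.md; then
      git config user.name "Automated"
      git config user.email "actions@users.noreply.github.com"
      git commit -m "Update README" README.md
      git push
    fi
```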
From this example I learned that you can set environment variables once, at the top of a workflow, such that they will be available in ALL jobs: …
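Like this - the variable name and value are just examples:

```yaml
name: Example workflow
on:
  push:
env:
  DATABASE_URL: postgres://localhost/example   # visible to every job and step below
jobs:
  one:
    runs-on: ubuntu-latest
    steps:
      - run: echo "$DATABASE_URL"
  two:
    runs-on: ubuntu-latest
    steps:
      - run: echo "$DATABASE_URL"
```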
Useful for seeing what's available for if: conditions (see context and expression syntax). …
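The usual recipe is to serialize a context with toJSON() and echo it from a step:

```yaml
- name: Dump the github context
  env:
    GITHUB_CONTEXT: ${{ toJSON(github) }}
  run: echo "$GITHUB_CONTEXT"
```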
Spotted in this Cloud Run example: …