Quick local API docs with Scalar

Today I’m sharing a quick-and-dirty script to take an OpenAPI description, spin up a docs server locally, and copy the URL into your clipboard. I also use a bit of glob expansion in my script to find the right folder, because I have a lot of APIs with long and formulaic directory names (TM Forum members know this story). I’m spinning up Scalar here as it’s my current favourite just-works local API docs platform.

The script:

#!/usr/bin/env -S uv run --script

# /// script
# requires-python = ">=3.11"
# dependencies = []
# ///

import glob
import subprocess
import random

port = random.randint(0, 500) + 3000

# Add your own fudge/expansion magic here if you have APIs to find
path = '../api-*/oas/*.oas.yaml'
url = f'http://localhost:{port}'

files = glob.glob(path)
if len(files) > 0:
  for f in files: # effectively serves only the first match, since scalar blocks
    # copy URL to clipboard for convenience
    subprocess.run(["pbcopy"], input=url, text=True)
    subprocess.run(["scalar", "document", "serve", "-p", str(port), f])
else:
  print("API not found");

I’m using Python, and I run the script with uv because I launch it from lots of different folders and uv takes care of the dependencies so neatly. (Confession: my actual script uses argparse and some inputs to feed the “guess the location of the API file from these three digits” magic, but that part probably isn’t useful to other people!)
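
If you want something similar, a minimal sketch of that argparse idea might look like the following – the argument name and glob pattern here are made up for illustration, not taken from my real script:

import argparse
import glob

parser = argparse.ArgumentParser(description="Serve local API docs with Scalar")
parser.add_argument("api_number", help="digits that identify the API directory")
args = parser.parse_args()

# build a glob pattern from the digits; adjust to match your own directory layout
path = f'../api-{args.api_number}*/oas/*.oas.yaml'
files = glob.glob(path)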

First I generate a random port number and use it to construct a URL – there’s no checking that the port is available, but since I rarely have more than four or five of these running at one time, it rarely collides, and I just run the script again if it does!
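
If collisions ever got annoying, you could check that a port is actually free before using it. Here’s one way to do that with the standard library’s socket module – this isn’t part of my script, just a sketch of the idea:

import random
import socket

def pick_free_port(low=3000, high=3500):
    # keep trying random ports until we find one we can bind,
    # i.e. nothing else is listening on it
    while True:
        port = random.randint(low, high)
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            try:
                s.bind(("127.0.0.1", port))
                return port
            except OSError:
                continue

There’s still a tiny window between checking the port and scalar actually binding it, but for local use that gap is small enough not to matter.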

The subprocess calls then run the commands I would actually run myself on the command line if I didn’t have a script to generate a random port number and do some path-fudging logic for me:

  • `echo http://localhost:3333 | pbcopy` (run this one first, because the next command blocks)
  • `scalar document serve -p 3333 openapi.yaml`

Then I have the URL and can paste it into my browser. You could just open a browser tab instead of copying, but I find that always picks the wrong tab/window/app/whatever, so having the URL in my clipboard is more useful to me. I also use a clipboard history tool, so it isn’t a problem to have lots of things writing to my copy buffer.
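
If you do prefer the browser-tab behaviour, Python’s webbrowser module in the standard library will do it – swapping something like this in for the pbcopy call is all it takes (the URL here is just the example one from above):

import webbrowser

# open the docs URL in the default browser instead of copying it to the clipboard
webbrowser.open("http://localhost:3333")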

Feel free to adapt the script to fit your own paths and preferred tools; some of you probably do this often enough that it’s worth having a wrapper rather than remembering the correct incantation, or relying on a hosted platform to upload to when the file is already local. And as always, if you do something differently, I’d love to hear about it!

API Specificity with Overlays and Enums

The more I work on API standards, the more I realise how few teams understand that they can adopt the standards and, without breaking any contract, adapt them to make a strong interface for their own application. One of my favourite examples is to add enums where a standard interface cannot dictate the exact values, but in your own implementation it is very helpful to do so. Continue reading

Markdown/Mermaid output for OpenAPI Arazzo

API reference documentation changed the way we built integrations, and eventually became part of the driving force for OpenAPI adoption and all the good tooling that flowed from it. As a developer experience specialist, I spend a lot of time thinking about how human users can work with the technical assets in a project. HTML-format API reference documentation does a great job of building that bridge when working on OpenAPI projects, but now I’m using Arazzo and it’s a very new standard with not nearly as many tools available for that format yet – so I built one.

From HTTP to OpenAPI with Optic

I’ve been using Optic’s CLI, an OpenAPI tool that does a bunch of things including diffing OpenAPI descriptions and comparing HTTP traffic with OpenAPI. My use case was an established API that didn’t have an OpenAPI file yet – using Optic we could create one as a starting point, and then move to a design-first workflow to make the changes that I was there to help with. For this blog post, I’ve used the example of https://api.joind.in as an excellent representation of an API still in use, but without an OpenAPI file and not built with code that a code generator would recognise. Continue reading

Save edits to OpenAPI as an Overlay

For teams that generate OpenAPI from their codebase, there’s a tough choice between maintaining rich and extensive content such as Markdown descriptions and examples in codebase annotations, or in making those changes to the generated file – and then losing them when the code changes and the file is regenerated. The new OpenAPI Overlay Specification defines a format for storing updates to an OpenAPI document, and there’s a new generation of tools to make it easy to do, so let’s take a look. Continue reading

Run GitHub Actions on Subdirectories

I come across a lot of “greedy” GitHub Actions, where automation is running across a whole project instead of only on the parts that are relevant. Examples might be code linters that report problems with documentation folders, or the inverse of that. It’s especially problematic in monorepos where we probably want to use the same tool when we’re doing the same task for different subfolders, but that tool might not make sense to run everywhere. Continue reading

Use multi-line values in GitHub Actions

I created an action that needed a rich Markdown value in it, because it’s our weekly meeting agenda template, which is formatted for humans with links and paragraphs and things. The Actions syntax produced errors when I tried to add the content directly to the action, but I got it to work by putting the content into a file and using the file contents as an environment variable. That’s really the punchline of this post, but read on if you would like more details and some examples. Continue reading

Pretty-print JSON with jq

Wrangling some document conversion the other day, I ended up in a situation where I had the JSON I needed, but in a completely unreadable format. Luckily, this problem is very easily fixable … when you know how. So today’s post is a quick recap on how I did that using jq, a very handy command-line tool for working with JSON. For the impatient, here’s the command:

cat posts.json | jq "." > better.json
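
If jq isn’t to hand, Python’s standard json module will do the same pretty-printing; here’s a small sketch using the same file names as the command above (a stdlib alternative, not part of the original jq recipe):

import json

# read the squashed JSON and write it back out with two-space indentation
with open("posts.json") as src:
    data = json.load(src)
with open("better.json", "w") as dst:
    json.dump(data, dst, indent=2)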

In this post we’ll look at the data I started with and what the different bits of the command do to help. Continue reading

API Description Pipelines

Working on API tools, I get to see inside lots of different organisations’ API projects and processes. Every scenario is different, but a common theme is that many companies use a more complicated API description workflow than you see in conference slide decks! This article shares my typical workflow, steps and chosen tools that might show up in an API description pipeline. Continue reading

Checking Links in Docs-As-Code Projects

Creating content requires accuracy as well as creativity and the ability to deliver. Working with docs-as-code gives a strong foundation and structure to work within, and means there’s a clear workflow where automation can be added to help us with the easy stuff, such as “do all these links work?”.

I really appreciate having the extra confidence and support of these types of tools, and having implemented them on multiple projects at this point, I thought I’d share my advice for anyone looking to do the same. Continue reading