Teach Your API Test Platform to Send Callbacks
Describing Callbacks in OpenAPI
I’m working with the Nexmo APIs quite a bit, so here’s an easy example from one of their specs. It’s from the “Number Insight” API – you provide a phone number and the API returns information about that number. There are a few different levels of information (and different levels of cost for each), but the “Advanced” level is pretty unreliable if you call it synchronously – so instead Nexmo will respond to your API call, then send a follow-up request to the URL you specify with a payload of all the data in it.
Here’s how the OpenAPI spec looks:
"/advanced/async/{format}":
  parameters:
    - $ref: "#/components/parameters/format"
  get:
    operationId: getNumberInsightAsync
    summary: Advanced Number Insight (async)
    parameters:
      - $ref: "#/components/parameters/callback"
      - $ref: "#/components/parameters/number"
    responses:
      "200":
        description: OK
        content:
          application/json:
            schema:
              $ref: "#/components/schemas/niResponseAsync"
    callbacks:
      onData:
        "{$request.query.callback}":
          post:
            operationId: asyncCallback
            summary: Asynchronous response
            description: Contains the response to your Number Insight Advanced API request.
            requestBody:
              content:
                application/json:
                  schema:
                    $ref: "#/components/schemas/niResponseJsonAdvanced"
            responses:
              "200":
                description: OK
(If you want to see the whole thing, it’s on GitHub.)
The callbacks section details the incoming requests to expect as a follow-up to the client’s API call, and what response to send. In the initial request, the callback parameter specifies where the callback will be directed – and Prism can do that too.
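To make that mapping concrete, here’s an illustrative sketch (not part of the spec file itself) of how the OpenAPI runtime expression in the callbacks section resolves against an incoming request:

# Incoming request to the mock API:
#   GET /advanced/async/json?api_key=...&number=...&callback=http://localhost:8080
#
# Callback key in the spec:
#   "{$request.query.callback}"
#
# At call time the expression resolves to the value of the callback
# query parameter – http://localhost:8080 – so that is where Prism
# sends the follow-up POST.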
Mocking The API and Callbacks
I can start up the “pretend” API with Prism like this:
prism mock number-insight.yml
Prism will output the endpoints that it got from the spec and then report its location – for me that is usually http://localhost:4010.
In order to receive the callback, I’ll need to pass a URL that Prism can reach. This is one scenario where you won’t need a public URL to test a webhook, because the webhook is coming from another local tool. In this case, I’ve got a simple webserver running on port 8080 that outputs the data it receives so I can inspect it.
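One way to build that inspection webserver is a few lines of Python’s standard library. This is a sketch of my own, not from the Nexmo or Prism docs – the class and function names are mine, and the port matches the 8080 used below:

```python
# Minimal webhook receiver: prints whatever body is POSTed to it
# and replies 200 so Prism treats the callback as delivered.
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer


class CallbackHandler(BaseHTTPRequestHandler):
    received = []  # bodies collected so far, handy for inspection

    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length).decode("utf-8")
        CallbackHandler.received.append(body)
        print(body)  # dump the payload to the console
        self.send_response(200)
        self.end_headers()

    def log_message(self, fmt, *args):
        pass  # suppress per-request access logs; only payloads print


def start_receiver(port=8080):
    """Start the receiver in a background thread; returns the server."""
    server = HTTPServer(("localhost", port), CallbackHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

Run `start_receiver()` (or call `serve_forever()` directly if you’d rather block) before firing the curl command, and each callback payload appears on the console.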
Ready? Here’s my curl command to request the async endpoint from the local mock API, giving the callback URL to send the data payload to:
curl "http://localhost:4010/advanced/async/json?api_key=123&api_secret=456&number=44777000777&callback=http://localhost:8080"
I get a response to my API call, containing a request_id and some other information, then (almost instantly, since this is a mock) Prism follows up by sending more example data to the callback URL I specified before.
Testing API Callbacks Locally
This setup is brilliant for a situation like this, where I want to develop something to handle this callback but don’t want to keep hitting the live API (not to mention that I’d run out of credit if I kept hitting this one!). I can work locally, regardless of whether my connectivity is good, whether the API is up, or whether it even exists yet.
In the future, Prism is expected (by me at least) to also support the new Webhooks feature that is slated for OpenAPI 3.1 – enabling us to use a local mock server to send incoming requests as well as handle our outgoing ones. I, for one, can’t wait!
Also published on Medium.