# API Tests Using n3 Rules
- Follow platform-specific instructions to install locally or in a Docker image.
- To run `api-tuner` in a GitHub workflow, add this action to your jobs:

  ```yaml
  - uses: fabasoad/setup-prolog-action@v1
  ```

- curl 7.83+

Install with npm:

```sh
npm i api-tuner
```
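As a sketch, a minimal workflow job combining the steps above might look like this (the job name, checkout step, and test file path are illustrative, not prescribed by api-tuner):

```yaml
jobs:
  api-tests:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # Provide the Prolog runtime required by the reasoner
      - uses: fabasoad/setup-prolog-action@v1
      - run: npm i api-tuner
      # Run the test case file (hypothetical path)
      - run: npx api-tuner test.n3
```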
```
> api-tuner --help
Usage: api-tuner [options] <path>...

Options:
  --lib <path>      Specify rules to include in all tests. Can be used multiple
                    times. Make sure to surround globs in quotes to prevent
                    expansion.
  --silent          Less output
  --debug           Enable debug output
  --raw             Output raw results from eye
  --base-iri <iri>  Specify the base IRI for parsing the test case files
  --version         Show version information
  --help            Show this help message
```
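For example, shared rule files can be pulled into every test via `--lib` (the file layout below is hypothetical; note the quoted glob, which keeps the shell from expanding it):

```sh
api-tuner --lib "lib/*.n3" tests/users.n3
```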
Create a test case file `test.n3`:

```n3
# test.n3
PREFIX earl: <http://www.w3.org/ns/earl#>
PREFIX tuner: <https://api-tuner.described.at/>
PREFIX resource: <https://api-tuner.described.at/resource#>
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
PREFIX string: <http://www.w3.org/2000/10/swap/string#>

<#getExampleDotCom>
  a earl:TestCase ;
  rdfs:label "Simple GET test" ;
  tuner:formula {
    # Execute the request and capture its response
    ( <http://localhost:1080/example.com> ?res ) resource:getIn [] .

    # Check the response status code and content type
    ?res tuner:http_code 200 ;
      tuner:header ( "content-type" "text/html" ) ;
    .

    # Check the body contains the words "Example Domain"
    ?res tuner:body ?body .
    ?body string:contains "Example Domain" .
  } ;
.
```
Execute the test case:

```sh
api-tuner test.n3
```

See the Documentation section below for more examples and detailed documentation of the N3 rules.
## Documentation

This section describes the N3 rules and vocabulary used by api-tuner for defining and executing API tests.
Commonly used prefixes in api-tuner tests:

```n3
PREFIX tuner: <https://api-tuner.described.at/>
PREFIX resource: <https://api-tuner.described.at/resource#>
PREFIX earl: <http://www.w3.org/ns/earl#>
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
PREFIX log: <http://www.w3.org/2000/10/swap/log#>
PREFIX string: <http://www.w3.org/2000/10/swap/string#>
```

A test case is defined as an `earl:TestCase`. The core logic of the test resides in `tuner:formula`.
```n3
<#myTest>
  a earl:TestCase ;
  rdfs:label "Description of my test" ;
  tuner:formula {
    # Test logic goes here
  } .
```

If the formula evaluates to true (all statements inside match), the test is considered `earl:passed`.
You can define a detailed request using the `tuner:Request` class.
```n3
<#test> tuner:request [
  a tuner:Request ;
  tuner:method "POST" ;
  tuner:url <http://example.com/api> ;
  tuner:header ( "Accept" "application/json" ) ;
  tuner:query ( "verbose" "true" ) ;
  tuner:body { <#s> <#p> <#o> }
] .
```

To execute the request and get a response:
```n3
<#test> tuner:request ?req .
?req tuner:response ?res .
```

The `resource:` namespace provides shortcuts for common HTTP methods. These helpers automatically assert a 200 OK response status unless used in a way that captures the response for further assertions.
- `( <url> ?res ) resource:getIn []`
- `( <url> ?body ?res ) resource:postIn []`
- `( <url> ?res ) resource:postIn []` (no body)
- `( <url> ?body ?res ) resource:putIn []`
Example:

```n3
( <http://example.com> ?res ) resource:getIn [] .
```

api-tuner supports different types of request bodies:
- Inline RDF: Uses an N3 formula. It is serialized as Turtle and sent with `Content-Type: text/turtle`.

  ```n3
  tuner:body { <#s> <#p> <#o> }
  ```

- File Reference: Sends the contents of a local file.

  ```n3
  tuner:body <file:data.json>
  ```

- Multipart Form:

  ```n3
  tuner:body [
    tuner:form ( "field1" "value1" ) ;
    tuner:form ( "fileField" <file:photo.jpg> "image/jpeg" ) ;
  ]
  ```
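Putting these pieces together, a full test case that sends an inline RDF body and asserts on the response might look like the following sketch (the endpoint URL is hypothetical, and prefix declarations are as listed in the Prefixes section above):

```n3
<#postTurtle>
  a earl:TestCase ;
  rdfs:label "POST an inline RDF body" ;
  tuner:request [
    a tuner:Request ;
    tuner:method "POST" ;
    tuner:url <http://localhost:1080/api> ;   # hypothetical endpoint
    tuner:header ( "Accept" "text/turtle" ) ;
    tuner:body { <#s> <#p> <#o> }
  ] ;
  tuner:formula {
    # Execute the request and capture its response
    <#postTurtle> tuner:request ?req .
    ?req tuner:response ?res .

    # Assert on the captured response
    ?res tuner:http_code 200 .
  } .
```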
Query parameters can be added to a request using `tuner:query`.

```n3
?req tuner:query ( "name" "value" ) .
```

Assertions are performed on the `tuner:Response` object (usually captured in a variable like `?res`).
Assert the HTTP status code:

```n3
?res tuner:http_code 200 .
```

Assert the presence and value of an HTTP header. Header names are case-insensitive.
- Exact match:

  ```n3
  ?res tuner:header ( "Content-Type" "application/json" ) .
  ```

- Regex match:

  ```n3
  ?res tuner:header ( "Content-Type" "application/.*" string:matches ) .
  ```
- Raw body string:

  ```n3
  ?res tuner:body ?body .
  ?body string:contains "Expected Text" .
  ```
- RDF Semantics: If the response is RDF, you can use `log:includes` to check its content.

  ```n3
  ?res tuner:body ?body .
  ?body log:includes { <#s> <#p> <#o> } .
  ```

  ⚠️ Be careful when using the `?res!log:includes` resource path shorthand, which will not work inside `tuner:formula`. Please refer to this discussion.
- JSON Path: If the response is JSON, you can use `tuner:jsonPath` to assert values within the body.

  ```n3
  # Exact match
  ?res tuner:jsonPath ( "$.foo" "bar" ) .
  # Custom assertion (e.g. regex, contains, math)
  ?res tuner:jsonPath ( "$.baz" "42" string:contains ) .
  ```

  ⚠️ Note that unlike RDF assertions, JSON Path assertions are made on the response itself, without using `tuner:body`.
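For instance, a complete test case against a JSON endpoint could combine a `resource:` helper with JSON Path assertions. This is a sketch: the endpoint and the expected field values are hypothetical.

```n3
<#getUser>
  a earl:TestCase ;
  rdfs:label "GET a JSON resource" ;
  tuner:formula {
    # Fetch the resource and capture the response
    ( <http://localhost:1080/users/1> ?res ) resource:getIn [] .

    # JSON Path assertions go directly on the response, not on tuner:body
    ?res tuner:jsonPath ( "$.id" "1" ) .
    ?res tuner:jsonPath ( "$.name" "Doe" string:contains ) .
  } .
```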
Use `tuner:assertThat` to fail a test with a custom message if a condition is not met.

```n3
{ ?value math:greaterThan 10 }!tuner:assertThat "Value should be greater than 10" .
```

Print messages to the console during test execution (depending on log level):
- `?message^tuner:info`: Prints an INFO message.
- `?message^tuner:trace`: Prints a DEBUG message.
Example:

```n3
"Starting request"^tuner:info .
```

- `() file:temp ?path`: Generates a temporary file path.
- `?path file:rm ?any`: Deletes a file.
- `?relative file:libPath ?absolute`: Resolves a path relative to the api-tuner library.
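As an illustrative sketch, the file helpers can be combined with logging inside a test formula. Prefix declarations are omitted here; the `file:` namespace IRI is the one bound by api-tuner's library, and the test label is made up.

```n3
<#tempFileDemo>
  a earl:TestCase ;
  rdfs:label "Temp file handling" ;
  tuner:formula {
    # Generate a temporary file path and log it
    () file:temp ?path .
    ?path^tuner:info .

    # Clean up the file afterwards
    ?path file:rm ?any .
  } .
```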
Setting the `--debug` flag will print verbose response information. The `--raw` flag will print the raw triples produced by the n3 rules.

Additionally, you can inspect the raw response files, which are written to the system's temp directory. They are prefixed with `api-tuner`, so you can list them with `ls -l "${TMPDIR:-/tmp}"/api-tuner*`, or upload them as CI artifacts, as shown in the GitHub Workflow step example below.
```yaml
- run: npx api-tuner ...
  env:
    TMPDIR: ${{ runner.temp }}
- if: failure()
  name: upload api-tuner response data
  uses: actions/upload-artifact@v7
  with:
    name: api-tuner-debug
    path: '${{ runner.temp }}/api-tuner*'
```