
QA Automation in your APIs with artificial intelligence

Integrating Artificial Intelligence (AI) into the API management process helps optimize development cycles, reducing delivery times and producing more robust solutions.

One of the areas where AI can add value to our API management lifecycle is quality assurance: it allows us to automatically generate the necessary mocks, together with their corresponding tests, to verify that an API behaves correctly according to its specification.

In this article, we will see how to generate, from an OpenAPI specification, the files needed to automate the testing flow of our API.

What is OpenAPI?

OpenAPI (OpenAPI Specification or OAS) is an open standard maintained by the OpenAPI Initiative, part of the Linux Foundation, which is used to describe and document a RESTful service, allowing both humans and machines to understand its capabilities.
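As a reference, this is a heavily trimmed fragment of the Petstore specification (expressed here in JSON, although YAML is equally valid), showing a path, a method, a query parameter, and its response codes:

{
  "openapi": "3.0.3",
  "info": { "title": "Swagger Petstore - OpenAPI 3.0", "version": "1.0.0" },
  "paths": {
    "/pet/findByStatus": {
      "get": {
        "parameters": [
          { "name": "status", "in": "query", "schema": { "type": "string" } }
        ],
        "responses": {
          "200": { "description": "successful operation" },
          "400": { "description": "Invalid status value" }
        }
      }
    }
  }
}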

What is WireMock?

WireMock is an open-source tool that simulates services based on the HTTP protocol.

This allows us to create predefined responses (stubs) for our APIs, which can later be integrated with API test automation tools.
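For illustration, a minimal stub mapping could look like this (the endpoint and payload are just examples):

{
  "request": { "method": "GET", "urlPath": "/pet/1" },
  "response": {
    "status": 200,
    "headers": { "Content-Type": "application/json" },
    "jsonBody": { "id": 1, "name": "doggie", "status": "available" }
  }
}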

What is a Postman collection?

A Postman collection is a set of requests for an HTTP-based service, including all the necessary components (parameters, headers, message body, etc.).

An important feature is that it allows us to include test scripts in requests to validate the behavior of our services based on the responses obtained.
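As an illustrative sketch, a minimal collection (Postman v2.1 format) with a single request and a test script could look like this (names and URL are examples):

{
  "info": {
    "name": "Petstore example",
    "schema": "https://schema.getpostman.com/json/collection/v2.1.0/collection.json"
  },
  "item": [
    {
      "name": "Finds Pets by status",
      "request": {
        "method": "GET",
        "url": "{{base_url}}/pet/findByStatus?status=available"
      },
      "event": [
        {
          "listen": "test",
          "script": {
            "type": "text/javascript",
            "exec": [
              "pm.test('Status code is 200', function () {",
              "    pm.response.to.have.status(200);",
              "});"
            ]
          }
        }
      ]
    }
  ]
}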

What is Newman?

Newman is an open-source tool that allows us to run Postman collections from the command line, as well as integrate them into a continuous integration flow.
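For example, Newman can be installed via npm and pointed at a collection (the file names below are illustrative):

npm install -g newman
newman run my-collection.json --environment my-environment.json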

What is n8n?

n8n is a visual platform that allows us to connect multiple heterogeneous data sources to create automated workflows without the need to implement code (no-code).
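Although workflows are built visually, n8n can export and import them as JSON, which makes them easy to version; a bare-bones export looks roughly like this (the workflow name and node are illustrative):

{
  "name": "OpenAPI QA automation",
  "nodes": [
    {
      "name": "Manual Trigger",
      "type": "n8n-nodes-base.manualTrigger",
      "typeVersion": 1,
      "position": [0, 0],
      "parameters": {}
    }
  ],
  "connections": {}
}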

Would you like to know more? Check out our article: How to identify the maturity level of your institution's APIs and their needs.

Implementing the workflow in n8n

We will implement the following workflow to automate the generation of our tests from an OpenAPI specification:

  1. Analyze our API specification in OpenAPI using Google Gemini AI (other AI platforms such as OpenAI, Anthropic, or Azure AI could be used as well) to obtain the different paths, methods, and response codes.
The request to the model uses the following prompt:

"text": "Generate json result from the following OpenAPI specification. The output should follow the provided schema. The OpenAPI specification is: {{ $json['petstore-swagger'].json }}"
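The "provided schema" referenced in the prompt is not reproduced here; one possible JSON Schema matching the output obtained in the next step could be (an illustrative assumption):

{
  "type": "array",
  "items": {
    "type": "object",
    "properties": {
      "path": { "type": "string" },
      "method": { "type": "string" },
      "response_codes": {
        "type": "array",
        "items": {
          "type": "object",
          "properties": { "code": { "type": "string" } },
          "required": ["code"]
        }
      }
    },
    "required": ["path", "method", "response_codes"]
  }
}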

  2. Convert the result into a JSON object so that it can be processed more efficiently:

[
  {
    "path": "/pet",
    "method": "put",
    "response_codes": [
      { "code": "200" },
      { "code": "400" },
      { "code": "404" },
      { "code": "422" },
      { "code": "default" }
    ]
  },
  {
    "path": "/pet",
    "method": "post",
    "response_codes": [
      { "code": "200" },
      { "code": "400" },
      { "code": "422" },
      { "code": "default" }
    ]
  },
  {
    "path": "/pet/findByStatus",
    "method": "get",
    "response_codes": [
      { "code": "200" },
      { "code": "400" },
      { "code": "default" }
    ]
  },
  ...
]

  3. Iterate over each path/method/response-code tuple to obtain, for that tuple, a WireMock stub that complies with our API specification and a Postman collection with its corresponding test scripts to verify that specific combination. (For performance testing or more complex loads, tools such as JMeter or Gatling can be used as needed.) The prompts used are:

{ "text": "I need a WireMock stub for the specified resource, method and response code. The stub needs to respond to JSON based on the OpenAPI definition. I need to include also the input parameters" },

{ "text": "I need a Postman collection for the specified resource, method and response code. The collection must have a test script to validate the response you previously configured in the Wiremock stub. The Postman collection base_url variable must be defined with the server in the OpenAPI definition. The Postman collection _postman_id attribute must be set to {{ $json.method }}-{{ $json.path }}-{{ $json.response_codes.code }}. The collection does not need a response example. Please, verify that the Postman collection you generate is a valid JSON" }

  4. Convert the result into a JSON object so that it can be processed more efficiently:

[
  {
    "response": {
      "wiremock_stub": "{\"request\":{\"method\":\"GET\",\"urlPath\":\"/pet/findByStatus\",\"queryParameters\":{\"status\":{\"equalTo\":\"available\"}}},\"response\":{\"status\":200,\"headers\":{\"Content-Type\":\"application/json\"},\"jsonBody\":[{\"id\":1,\"name\":\"doggie\",\"category\":{\"id\":1,\"name\":\"Dogs\"},\"photoUrls\":[\"http://example.com/photos/doggie1.jpg\"],\"tags\":[{\"id\":10,\"name\":\"cute\"}],\"status\":\"available\"},{\"id\":2,\"name\":\"kitty\",\"category\":{\"id\":2,\"name\":\"Cats\"},\"photoUrls\":[\"http://example.com/photos/kitty1.jpg\",\"http://example.com/photos/kitty2.jpg\"],\"tags\":[{\"id\":20,\"name\":\"fluffy\"}],\"status\":\"available\"}]}}",
      "postman_collection": "{\"info\":{\"_postman_id\":\"get-/pet/findByStatus-200\",\"name\":\"Swagger Petstore - OpenAPI 3.0\",\"description\":\"This is a sample Pet Store Server based on the OpenAPI 3.0 specification.  You can find out more about Swagger at [https://swagger.io](https://swagger.io). In the third iteration of the pet store, we've switched to the design first approach! You can now help us improve the API whether it's by making changes to the definition itself or to the code. That way, with time, we can improve the API in general, and expose some of the new features in OAS3.\n\nSome useful links:\n- [The Pet Store repository](https://github.com/swagger-api/swagger-petstore]\n- [The source API definition for the Pet Store](https://github.com/swagger-api/swagger-petstore/blob/master/src/main/resources/openapi.yaml)\",\"schema\":\"https://schema.getpostman.com/json/collection/v2.1.0/
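For readability, the escaped wiremock_stub value above corresponds to the following WireMock mapping:

{
  "request": {
    "method": "GET",
    "urlPath": "/pet/findByStatus",
    "queryParameters": { "status": { "equalTo": "available" } }
  },
  "response": {
    "status": 200,
    "headers": { "Content-Type": "application/json" },
    "jsonBody": [
      { "id": 1, "name": "doggie", "category": { "id": 1, "name": "Dogs" }, "photoUrls": ["http://example.com/photos/doggie1.jpg"], "tags": [{ "id": 10, "name": "cute" }], "status": "available" },
      { "id": 2, "name": "kitty", "category": { "id": 2, "name": "Cats" }, "photoUrls": ["http://example.com/photos/kitty1.jpg", "http://example.com/photos/kitty2.jpg"], "tags": [{ "id": 20, "name": "fluffy" }], "status": "available" }
    ]
  }
}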

  5. Run the generated tests

Once we have the WireMock stub and the associated Postman collection, we can load them into our local environment and run a test.

It is worth mentioning that although this process is performed manually in this article, it can be easily automated within our continuous integration pipeline using tools such as GitHub Actions, GitLab CI, or Jenkins:

  1. Import the OpenAPI specification into the API Manager (in our case, we are using the Gravitee.io solution)

  2. Load the stub into the WireMock instance (an example command is shown after the Newman output below)

  3. Launch our Postman collection (we will use the Newman tool):

newman run postman-collection-petstore-pet-get-200.json --insecure --global-var "base_url=https://api.chakray.local/swaggerpetstore"
Swagger Petstore – OpenAPI 3.0

→ Finds Pets by status

  GET https://api.chakray.local/swaggerpetstore/pet/findByStatus?status=available [200 OK, 591B, 66ms]

  ✓  Status code is 200

  ✓  Content-Type header is application/json

  ✓  Response is an array

  ✓  Each item in the array is a valid Pet object

┌─────────────────────────┬──────────────────┬──────────────────┐
│                         │         executed │           failed │
├─────────────────────────┼──────────────────┼──────────────────┤
│              iterations │                1 │                0 │
├─────────────────────────┼──────────────────┼──────────────────┤
│                requests │                1 │                0 │
├─────────────────────────┼──────────────────┼──────────────────┤
│            test-scripts │                1 │                0 │
├─────────────────────────┼──────────────────┼──────────────────┤
│      prerequest-scripts │                0 │                0 │
├─────────────────────────┼──────────────────┼──────────────────┤
│              assertions │                4 │                0 │
├─────────────────────────┴──────────────────┴──────────────────┤
│ total run duration: 109ms                                     │
├───────────────────────────────────────────────────────────────┤
│ total data received: 161B (approx)                            │
├───────────────────────────────────────────────────────────────┤
│ average response time: 66ms [min: 66ms, max: 66ms, s.d.: 0µs] │
└───────────────────────────────────────────────────────────────┘
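As referenced in step 2 above, the stub can be loaded into a running WireMock instance through its admin API. A minimal sketch, assuming a local standalone WireMock listening on its default port 8080 and an illustrative stub file name:

curl -X POST http://localhost:8080/__admin/mappings \
  -H "Content-Type: application/json" \
  -d @wiremock-stub-pet-findbystatus-get-200.json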

Conclusion

In this article, we have created a workflow that allows us to integrate an artificial intelligence assistant (Google Gemini) into our API management lifecycle, with the aim of automating the testing process by generating tests based on an OpenAPI specification.