
Draft: BDD tests PoC (Do not merge!)

Jaime Nuche requested to merge T365882 into main

This commit contains a set of changes that showcase how to add Behavior-Driven Development (BDD) tests to the Catalyst API.

BDD tests are a type of functional test that verifies the behavior of an application from the point of view of particular roles or stakeholders, typically a user. In the case of Catalyst, the role is that of a consumer of the REST API.

An instance of the API is started up and tested against. A full E2E test would require an actual Kubernetes cluster, but provisioning one during the tests would be too slow and expensive. On the other hand, using an external cluster for running the tests (dedicated or shared) would require extra maintenance, constrain the way the tests are written and potentially block our pipelines during an outage. Using mocks for our K8s backend dependencies is the preferred alternative.

The tests shown in this commit mock our two K8s backend dependencies: the Helm client and the K8s client. This greatly simplifies the tests, allows them to define their expectations and keeps resource consumption low. The downside is that we're not testing against a real K8s cluster. Our other main backend dependency is persistence; in that case, spinning up an ephemeral MySQL/MariaDB instance for the duration of the tests is feasible and doesn't require mocking (note that this PoC doesn't do that yet and expects the DB to be available).
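
For reference, spinning up that ephemeral DB could look roughly like the sketch below using testcontainers-go. This is not part of the PoC; the image tag, env vars and helper name are illustrative assumptions.

```go
// Sketch only: start a throwaway MariaDB for the test run with
// testcontainers-go. Image tag, env vars and function name are illustrative.
package bdd_test

import (
	"context"
	"fmt"

	"github.com/testcontainers/testcontainers-go"
	"github.com/testcontainers/testcontainers-go/wait"
)

// startEphemeralDB starts a MariaDB container and returns it plus a DSN
// pointing at the "catalyst" database, matching the values this PoC expects.
func startEphemeralDB(ctx context.Context) (testcontainers.Container, string, error) {
	req := testcontainers.ContainerRequest{
		Image: "mariadb:11",
		Env: map[string]string{
			"MARIADB_ROOT_PASSWORD": "secret",
			"MARIADB_DATABASE":      "catalyst",
		},
		ExposedPorts: []string{"3306/tcp"},
		WaitingFor:   wait.ForListeningPort("3306/tcp"),
	}
	c, err := testcontainers.GenericContainer(ctx, testcontainers.GenericContainerRequest{
		ContainerRequest: req,
		Started:          true,
	})
	if err != nil {
		return nil, "", err
	}
	host, err := c.Host(ctx)
	if err != nil {
		return nil, "", err
	}
	port, err := c.MappedPort(ctx, "3306")
	if err != nil {
		return nil, "", err
	}
	dsn := fmt.Sprintf("root:secret@tcp(%s:%s)/catalyst", host, port.Port())
	return c, dsn, nil
}
```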

In order to be able to inject mocks, the Catalyst production code had to be significantly refactored. Interfaces were added for the backend deps that need to be mocked, and a majority of the files had to be grouped into packages with singletons that can be injected as needed. The API's bootstrap process then injects the instances required in a particular context (production singletons for production, mocks for testing).
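
To give an idea of the shape of that refactor, a minimal sketch is below. The interface and type names are hypothetical, not the actual Catalyst identifiers.

```go
// Sketch only: the general shape of the dependency-injection refactor.
// All identifiers here are hypothetical.
package deps

import "context"

// HelmClient abstracts the Helm backend so it can be mocked in tests.
type HelmClient interface {
	InstallRelease(ctx context.Context, name, chart string, values map[string]any) error
	UninstallRelease(ctx context.Context, name string) error
}

// K8sClient abstracts the Kubernetes backend.
type K8sClient interface {
	CreateNamespace(ctx context.Context, name string) error
}

// Deps groups the injectable backends. The production bootstrap fills it with
// the real singletons; the BDD suite fills it with generated mocks.
type Deps struct {
	Helm HelmClient
	K8s  K8sClient
}
```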

Note that the tests and the Catalyst instance being tested run in the same process and share their memory space.

The Ginkgo test suite in charts_test.go shows three specs that test the different expected behaviors of the /charts endpoints: getting all charts, getting a detailed view of one chart, and verifying the error returned when a chart does not exist. These specs don't rely on the mocks.
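
As an illustration of what such a spec looks like (and of the in-process setup mentioned above), here is a simplified, self-contained sketch. The stub handler stands in for the real Catalyst router, and the names and response shape are simplified assumptions.

```go
// Sketch only: the shape of a Ginkgo/Gomega spec against an in-process API.
// The stub handler below stands in for the real Catalyst handler, which the
// actual suite builds through its bootstrap code.
package charts_test

import (
	"encoding/json"
	"net/http"
	"net/http/httptest"
	"testing"

	. "github.com/onsi/ginkgo/v2"
	. "github.com/onsi/gomega"
)

func TestCharts(t *testing.T) {
	RegisterFailHandler(Fail)
	RunSpecs(t, "Charts Suite")
}

var server *httptest.Server

var _ = BeforeSuite(func() {
	// Stand-in handler; the PoC serves the actual Catalyst router here,
	// so the tests and the API share the test process's memory space.
	mux := http.NewServeMux()
	mux.HandleFunc("/charts", func(w http.ResponseWriter, r *http.Request) {
		json.NewEncoder(w).Encode([]map[string]string{{"name": "demo-chart"}})
	})
	server = httptest.NewServer(mux)
})

var _ = AfterSuite(func() {
	server.Close()
})

var _ = Describe("GET /charts", func() {
	It("returns the list of charts", func() {
		resp, err := http.Get(server.URL + "/charts")
		Expect(err).NotTo(HaveOccurred())
		defer resp.Body.Close()
		Expect(resp.StatusCode).To(Equal(http.StatusOK))

		var charts []map[string]string
		Expect(json.NewDecoder(resp.Body).Decode(&charts)).To(Succeed())
		Expect(charts).NotTo(BeEmpty())
	})
})
```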

The Ginkgo test suite in envs_test shows a single spec that verifies the workflow for a successful environment creation. In this case, the spec configures the mocks with the calls it expects to see and with the behavior required for the test to complete successfully.
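
A rough sketch of that pattern with uber-go/mock is below. The mock constructors are assumed to be mockgen-generated from the backend interfaces and to live in the test package; all identifiers are placeholders, not the actual Catalyst ones, and the direct mock calls at the end only stand in for the requests the real spec sends to the API.

```go
// Sketch only: configuring mock expectations in the environment-creation
// spec. NewMockHelmClient/NewMockK8sClient are assumed to be generated with
// mockgen from the (hypothetical) HelmClient/K8sClient interfaces, e.g.:
//
//   mockgen -destination=mocks_test.go -package=envs_test <module>/deps HelmClient,K8sClient
package envs_test

import (
	"context"

	. "github.com/onsi/ginkgo/v2"
	. "github.com/onsi/gomega"
	"go.uber.org/mock/gomock"
)

var _ = Describe("POST /environments", func() {
	It("creates an environment successfully", func() {
		ctrl := gomock.NewController(GinkgoT())
		defer ctrl.Finish()

		helm := NewMockHelmClient(ctrl) // mockgen-generated (placeholder name)
		k8s := NewMockK8sClient(ctrl)   // mockgen-generated (placeholder name)

		// Declare the backend calls the workflow is expected to make, and
		// have them succeed so the environment creation can complete.
		k8s.EXPECT().CreateNamespace(gomock.Any(), gomock.Any()).Return(nil)
		helm.EXPECT().InstallRelease(gomock.Any(), gomock.Any(), gomock.Any(), gomock.Any()).Return(nil)

		// In the real spec, helm and k8s are injected into the API's
		// bootstrap and the spec POSTs to /environments through the
		// in-process server; the API then makes these calls. Here we invoke
		// the mocks directly only so the standalone sketch satisfies the
		// expectations verified by ctrl.Finish().
		Expect(k8s.CreateNamespace(context.Background(), "env-123")).To(Succeed())
		Expect(helm.InstallRelease(context.Background(), "env-123", "some-chart", nil)).To(Succeed())
	})
})
```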

Libraries used:

  • Ginkgo
  • Gomega
  • uber-go/mock

Running the two suites sequentially on my local toaster of a laptop takes ~8.8s. To run the tests, run the following from the repo root: ginkgo -r. You may need to run go mod tidy to fetch the new deps and install the Ginkgo CLI as per https://onsi.github.io/ginkgo/#getting-started

A MySQL/MariaDB instance also needs to be running on port 3306, containing a DB named catalyst, with the root user's password set to secret (you can change those values in bdd.go).

Bug: T365882
