
This document relates to investigative work being carried out on the Jira ticket POLICY-3809. This work specification is in response to requirements set out by IDUN for integrating the Policy Framework Kubernetes pods / Helm charts into their system. The general requirements of the investigation are listed below:

  • How to create a Kubernetes environment that can be spun up and made available on demand on suitable K8S infrastructure.
  • How suitable test suites would be developed to verify the functional requirements below.
  • How such test suites could be implemented using "Contract Testing".

Functional Requirements Detail

Note that many of the features below are available in Postgres. In the verification environment, we want to verify that the Policy Framework continues to work in the following scenarios:

  • Synchronization and Load Balancing
  • Failover
  • Backup and Restore

In addition, the environment should:

  • Support measurement of Performance Lag
  • Use secure communication towards the Database
  • Verify that auditing of database operations is working


Investigated Testing Approaches

This section outlines some commonly used approaches to testing, as well as some more unique/less common approaches.

Chart Tests

Chart tests are built into Helm and are documented here: https://helm.sh/docs/topics/chart_tests/. The task of a chart test is to verify that a chart works as expected once it is installed. Each Helm chart has a templates directory under it, and the test file contains the YAML definition of a Kubernetes Job. A Job in Kubernetes is essentially a resource that creates a Pod to carry out a specific task; once the task completes, the Job finishes and its Pods can be cleaned up. In the test, the Job runs a specified command and is considered a success if the container exits successfully (exit 0).

Examples:

  • Validate that your configuration from the values.yaml file was properly injected.
    • Make sure your username and password work correctly
    • Make sure an incorrect username and password does not work
  • Assert that your services are up and correctly load balancing
  • Test successful connection to a database using a specified secret

The simplicity of specifying tests in this way is a major advantage. Tests can then simply be run with a "helm test" command.
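
A minimal sketch of what such a test could look like is shown below. It is written here as a simple Pod carrying the Helm test hook annotation (a Job can equally be used); the release-derived names, image, secret name and values keys are illustrative placeholders, not taken from any real chart:

# templates/tests/test-db-connection.yaml (illustrative file name)
apiVersion: v1
kind: Pod
metadata:
  name: "{{ .Release.Name }}-test-db-connection"
  annotations:
    # this annotation marks the resource as a test that "helm test" will run
    "helm.sh/hook": test
spec:
  restartPolicy: Never
  containers:
    - name: test-db-connection
      image: postgres:14
      env:
        # password taken from a specified secret, as in the examples above
        - name: PGPASSWORD
          valueFrom:
            secretKeyRef:
              name: "{{ .Release.Name }}-db-credentials"
              key: password
      # the test passes if this command exits with code 0
      command: ["psql", "-h", "{{ .Release.Name }}-postgresql", "-U", "{{ .Values.db.user }}", "-c", "SELECT 1"]

Once the chart is installed, a test like this would then be executed with "helm test <RELEASE_NAME>".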

Helm Unit Test Plugin

There is an open source project for this on GitHub - https://github.com/quintush/helm-unittest. It is easy to install as it is designed as a Helm plugin. The plugin allows tests to be defined in YAML to confirm basic functionality of the chart, and it is very simple to operate. You define a tests/ directory under your chart, e.g. YOUR_CHART/tests/deployment_test.yaml. An example test suite is defined below:

suite: test deployment
templates:
  - deployment.yaml
tests:
  - it: should work
    set:
      image.tag: latest
    asserts:
      - isKind:
          of: Deployment
      - matchRegex:
          path: metadata.name
          pattern: -my-chart$
      - equal:
          path: spec.template.spec.containers[0].image
          value: nginx:latest

The test asserts a few different things: that the rendered template is a Deployment, that the resource name ends with the chart name, and that the expected container image is used. A simple CLI command is then used to run the test:

helm unittest $YOUR_CHART
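
Note that the plugin itself must be installed first. The project's README describes a standard Helm plugin installation along these lines (the exact command and version pinning may differ):

helm plugin install https://github.com/quintush/helm-unittest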

Although this library is useful, it does not actually test the functionality of the deployed chart, only its specification.


Octopus

The Kyma project is a cloud-native application runtime that uses Kubernetes, Helm and a number of other components. They used Helm chart tests extensively and appreciated how easy the tests were to specify. However, they did find some shortcomings:

  • Running the whole suite of integration tests took a long time, so they needed an easy way of selecting tests they wanted to run.
  • The number of flaky tests increased, and they wanted to ensure they are automatically rerun.
  • They needed a way of verifying the tests' stability and detecting flaky tests.
  • They wanted to run tests concurrently to reduce the overall testing time.

For these reasons, Kyma developed their own tool called Octopus, which tackles all of the issues above: https://github.com/kyma-incubator/octopus/blob/master/README.md

In developing tests using Octopus, the tester defines two files:

  • TestDefinition file: Defines a test for a single component or a cross-component scenario. We can see in the example below that the custom TestDefinition resource is used to define a Pod with a specified container image, and a simple command is carried out. This is not dissimilar to the way that helm test defines tests for charts.
TestDefinition
apiVersion: testing.kyma-project.io/v1alpha1
kind: TestDefinition
metadata:
  labels:
    component: service-catalog
  name: test-example
spec:
  template:
    spec:
      containers:
        - name: test
          image: alpine:latest
          command:
            - "pwd"

  • ClusterTestSuite file: Defines which tests to run on the cluster and how to run them. In the example below, only tests with the "service-catalog" label are selected. It also specifies how many times each test should be executed (count), how many retries are allowed (maxRetries), and the maximum number of tests that may run concurrently (concurrency).

ClusterTestSuite
apiVersion: testing.kyma-project.io/v1alpha1
kind: ClusterTestSuite
metadata:
  labels:
    controller-tools.k8s.io: "1.0"
  name: testsuite-selected-by-labels
spec:
  count: 1
  maxRetries: 1
  concurrency: 2
  selectors:
    matchLabelExpressions:
      - component=service-catalog

Although this project seems to make some improvements on the Helm chart tests, it is unclear how mature the project is. The documentation details how to define the specified files and how to use the kubectl CLI to execute a test - https://github.com/kyma-incubator/octopus/blob/master/docs/tutorial.md
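
As a hedged sketch of what execution could look like (assuming Octopus and its CRDs are already installed on the cluster, and that the two resources above have been saved to the hypothetical files test-definition.yaml and cluster-test-suite.yaml):

# register the test and the suite that selects it
kubectl apply -f test-definition.yaml
kubectl apply -f cluster-test-suite.yaml

# Octopus schedules the selected tests; inspect the suite status for the results
kubectl get clustertestsuites.testing.kyma-project.io testsuite-selected-by-labels -o yaml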



