
Note: This page is for discussion and aims to define the possible requirements for a simulator wrapper within pythonsdk-tests.

Andreas Geissler, Michał Jagiełło, Illia Halych, Krzysztof Kuzmicki


When we launch E2E tests, we sometimes need to launch a third-party simulator (e.g. pnf-simulator).

It would be a cool feature if we had wrappers within pythonsdk-tests to start/stop/configure a simulator and to drive its REST API as prerequisite steps.

It obviously depends on the simulators and on their ability to be controlled and to offer a REST API for the test.

We may, however, define an API that allows pythonsdk-tests to consume such simulators.
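As a strawman, such an API could be a small abstract interface that every simulator wrapper implements. Below is a minimal sketch in Python; the class and method names (SimulatorWrapper, start, status, stop) are illustrative assumptions, not an agreed contract.

```python
from abc import ABC, abstractmethod
from typing import Any, Dict, Optional


class SimulatorWrapper(ABC):
    """Hypothetical contract that every simulator wrapper would implement."""

    @abstractmethod
    def start(self, config: Optional[Dict[str, Any]] = None) -> None:
        """Start the simulator, optionally with a specific configuration."""

    @abstractmethod
    def status(self) -> str:
        """Return the current simulator status (e.g. 'running', 'stopped')."""

    @abstractmethod
    def stop(self) -> None:
        """Stop the simulator and release its resources."""
```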

Any simulator

  • shall be hosted in dedicated repository (do one thing and do it well)
  • shall include a docker build chain and be runnable through a docker run command
  • shall be available in the ONAP Nexus

Through the standard docker commands, it shall be possible to (see the sketch after this list):

  • start the simulator
  • start with specific configuration
  • stop the simulator
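For illustration, driving these standard docker commands from Python could look roughly like the sketch below, using the docker SDK for Python (docker-py). The image name and Nexus registry path are assumptions, not a confirmed image location.

```python
from typing import Optional

import docker  # docker-py: pip install docker

# Image name is an assumption for illustration; any simulator image
# published to the ONAP Nexus registry would work the same way.
IMAGE = "nexus3.onap.org:10001/onap/pnf-simulator:latest"


def start_simulator(config_dir: Optional[str] = None):
    """Start the simulator container, optionally mounting a configuration."""
    client = docker.from_env()
    volumes = {config_dir: {"bind": "/config", "mode": "ro"}} if config_dir else None
    return client.containers.run(IMAGE, detach=True, name="pnf-simulator",
                                 volumes=volumes)


def stop_simulator(container) -> None:
    """Stop and remove the simulator container."""
    container.stop()
    container.remove()
```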

If the feature is available, a REST API shall be exposed to reconfigure the simulator or trigger workflows.
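Where such an API exists, reconfiguring or triggering a workflow could be a plain HTTP call. The endpoint and payload below are purely hypothetical; each simulator defines its own API.

```python
import requests

# Hypothetical endpoint and payload; real simulators define their own API.
response = requests.post(
    "http://localhost:5000/simulator/configure",
    json={"testCase": "basic", "repeatCount": 3},
    timeout=10,
)
response.raise_for_status()
```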

The wrapper step within pythonsdk-tests shall (see the sketch after this list):

  • detect whether the simulator is available (if not, the test shall fail immediately with a simulatorNotAvailable exception)
  • launch the simulator
  • get the simulator status
  • stop the simulator
  • exchange with the simulator if an API is available
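A rough sketch of what such a wrapper step could look like. The names here (SimulatorStep, SimulatorNotAvailable, the health endpoint) are assumptions for illustration, not the actual pythonsdk-tests API.

```python
import requests


class SimulatorNotAvailable(Exception):
    """Raised when the simulator cannot be reached before the test starts."""


class SimulatorStep:
    """Hypothetical wrapper step for a simulator inside pythonsdk-tests."""

    def __init__(self, health_url: str = "http://localhost:5000/health"):
        self.health_url = health_url  # assumed health endpoint

    def status(self) -> bool:
        """Return True if the simulator answers its health endpoint."""
        try:
            requests.get(self.health_url, timeout=5).raise_for_status()
            return True
        except requests.RequestException:
            return False

    def execute(self) -> None:
        """Fail immediately when the simulator is unreachable, then run."""
        if not self.status():
            raise SimulatorNotAvailable(self.health_url)
        # ...exchange with the simulator API, trigger workflows...

    def cleanup(self) -> None:
        """Stop the simulator, e.g. via docker or its REST API."""
```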

Open questions

Use of a submodule from pythonsdk to know the available simulators? How to customize the configuration?


The "scalable steps" solution

Concept: INT-1812

Implementation: INT-1829

Author: Illia Halych

Problem: all simulators are different; there is no trivial solution that fits all.

Challenges:

  1. Include general functionalities, specific functionalities, abstractions.
  2. Make it flexible to extend the wrapper based on specific needs.
  3. Patch custom configurations.
  4. Cleanup when something fails.

Concept:

  1. Use the step-by-step execution that is already available in pythonsdk-tests and implement the simulator as a step-by-step process.
  2. The step-by-step execution allows patching configurations for each independent step separately before it starts.
  3. The step-by-step execution allows a "rollback" (cleanup) from the step where the problem occurred (see the sketch after this list).
  4. The step-by-step process execution is capable of changing the order of steps and dropping certain steps. 
  5. The first (1) basic step has the lowest level of abstraction - most common functionalities for all.
  6. The zero (0) basic step would be a substitution for the first step (1) for complex models.
  7. The third (3) basic step has the highest level of abstraction - least common functionalities for all.
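A minimal sketch of that execute/rollback chaining, assuming each step keeps a reference to its parent; the names are illustrative, not the actual pythonsdk-tests classes.

```python
from typing import Optional


class Step:
    """One step in the chain; subclasses override act() and rollback()."""

    def __init__(self, parent: Optional["Step"] = None):
        self.parent = parent

    def act(self) -> None:
        """The actual work of this step (overridden by subclasses)."""

    def rollback(self) -> None:
        """Undo the work of this step (overridden by subclasses)."""

    def execute(self) -> None:
        """Run the parent chain first, then this step; clean up on failure."""
        if self.parent:
            self.parent.execute()
        try:
            self.act()
        except Exception:
            self.cleanup()
            raise

    def cleanup(self) -> None:
        """Roll back from this step back to the beginning of the chain."""
        self.rollback()
        if self.parent:
            self.parent.cleanup()
```

Dropping or reordering steps then amounts to rewiring the parent references before execute() is called.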

Solution:

  1. Simulators' images and other setup are described in Helm charts. It is advised that each simulator's repo contains these charts, so we can avoid duplication and inconsistency.
  2. Helm charts may be stored in a remote repository as well as in a local folder. Remote is preferred to avoid duplicated code.
  3. Pyhelm requires Tiller's host address on the target machine. Tiller may run as a background process addressable via localhost; other setups are possible too (a deployment sketch follows this list).
  4. Pyhelm supports Helm v2, which is sufficient for the Guilin release versions. Pyhelm was nevertheless tested on a machine with Helm v3 and ran without problems.
  5. STEP LEVEL 2 takes HTTP/HTTPS request configurations during init.
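For illustration, deploying a simulator chart through Pyhelm could look roughly as below. The ChartBuilder/Tiller usage follows the Pyhelm README at the time of writing; the chart name, repository URL, and Tiller host are assumptions.

```python
from pyhelm.chartbuilder import ChartBuilder
from pyhelm.tiller import Tiller

# Chart name and repository URL are assumptions; ideally they point
# at the charts kept in the simulator's own repo (see item 1 above).
chart = ChartBuilder({
    "name": "pnf-simulator",
    "source": {"type": "repo",
               "location": "https://helm.onap.org/repository/charts"},
})

# Tiller reachable on localhost (Helm v2); other hosts work as well.
tiller = Tiller("localhost")
tiller.install_release(chart.get_helm_chart(), dry_run=False,
                       namespace="onap-tests")
```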





