The TSC has decided to launch a task to define how to improve pair-wise testing for the Dublin release, as defined in TSC-42.

Current situation

For the Beijing Release, at the Santa Clara event (Dec 2017), it was decided to perform pair-wise tests to achieve S3P.

For the Casablanca Release, projects had to perform pair-wise testing for the RC0 milestone. Pair-wise testing had to be performed only on an OOM installation.

Various projects defined their own tests and ran them when possible, with no formal recommendation.

PTLs must declare the various tests in the Integration board: Integration Weather Board for Casablanca Release



The problems

  • Dependency problem: to perform pair-wise testing, the two paired components must be ready. For Casablanca, some components were available late, reducing the time left to perform pair-wise tests.
  • No formal guidance on what to test
  • No formal way to describe the tests
  • No test storage
  • No consistent tests: some projects perform tests in both directions, but there is no global consistency (e.g. NBI tested the SO - NBI pair, while SO did not test it)


What to test?

Each ONAP component includes a large number of Docker containers. Each project is responsible for performing unit tests to validate the interaction between these Docker containers and for validating the APIs offered by the component (e.g. SO is responsible for testing the various APIs it offers).

The main objective is to test that the two components interact correctly (e.g. using the correct ports, protocol, API version...).
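As an illustration, a pair-wise check of this kind boils down to comparing what the consumer expects on an interface with what the provider actually offers. The Python sketch below is purely illustrative: the component names, ports, and API versions are assumptions, not values taken from any ONAP project.

```python
# Illustrative sketch of a pair-wise compatibility check between two
# ONAP components. All concrete values below are made-up examples.
from dataclasses import dataclass

@dataclass
class ComponentContract:
    """What a component offers (or expects) on a given interface."""
    name: str
    port: int
    protocol: str      # e.g. "http"
    api_version: str   # e.g. "v7"

def check_pairwise(consumer: ComponentContract,
                   provider: ComponentContract) -> list:
    """Return the mismatches between what the consumer expects and
    what the provider offers (port, protocol, API version)."""
    mismatches = []
    for field in ("port", "protocol", "api_version"):
        expected = getattr(consumer, field)
        offered = getattr(provider, field)
        if expected != offered:
            mismatches.append(f"{field}: {consumer.name} expects {expected}, "
                              f"{provider.name} offers {offered}")
    return mismatches

# Example: NBI calling SO (values are invented for illustration).
nbi_view_of_so = ComponentContract("NBI", 8080, "http", "v7")
so_actual = ComponentContract("SO", 8080, "http", "v7")
print(check_pairwise(nbi_view_of_so, so_actual))  # prints [] (compatible)
```

In a real pair-wise test the provider side would be observed from the running component rather than declared by hand; the comparison logic stays the same.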


When to perform pair-wise tests?

Pair-wise testing must be executed between M4 and RC0 to validate the functionality between two components.

Performing tests manually between M4 and RC0 is not optimal. Pair-wise tests should be automated as much as possible and replayed to guarantee that there is no unexpected behavior when a component's code is updated.
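One way to make the tests replayable is to keep the pair list as data and run every entry on each replay, reporting failing pairs instead of aborting the whole run. A minimal Python sketch, with stub checks standing in for real API calls (the pairs and checks below are illustrative assumptions):

```python
# Illustrative sketch of a replayable pair-wise test runner.
# The pair names and stub checks are placeholders, not real tests.

def run_pairwise(pairs):
    """Run every (consumer, provider, check) entry; return the names
    of pairs whose check failed, so CI can flag regressions on replay."""
    failures = []
    for consumer, provider, check in pairs:
        try:
            ok = check()
        except Exception:
            ok = False  # a crashing check counts as a failure
        if not ok:
            failures.append(f"{consumer}->{provider}")
    return failures

# Placeholder pairs: in a real suite each lambda would exercise the
# provider's API through the consumer's client.
PAIRS = [
    ("NBI", "SO", lambda: True),
    ("SO", "SDNC", lambda: True),
]

print(run_pairwise(PAIRS))  # prints [] when every pair passes
```

Because the runner is data-driven, adding a new pair is a one-line change and the same suite can be replayed unchanged after every component update.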

Identify a list of 'core' projects that must be ready before the others, to avoid last-minute tests putting high pressure on core components => define pair-wise tests to be run on a daily basis after RC0.


Draft Proposed recommendations:

  • Formally define the component dependencies for each project (cf. NBI proposal) (by project)
  • Run CSIT on an OOM installation, using the OOM capability to launch a single component by overriding the onap/values.yaml file => OOM & Integration teams to provide the guidelines (by project)
  • Develop a tool to verify that pair-wise tests are defined based on the component dependencies (by integration)
  • Define pair-wise tests by M4 (by project)
  • Run pair-wise tests under the Integration project on a daily basis after M4 (e.g. deploy a fresh, full ONAP and run the pair-wise tests) (by integration)
  • Store tests in a DB (by integration). Tests could also be stored in a git repo, so they are easily accessible.
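The verification tool proposed above could, at its core, cross-check the declared dependencies against the tests actually defined, flagging pairs that have no test at all and pairs covered in one direction only (the NBI/SO inconsistency described earlier). A minimal Python sketch, with made-up project pairs for illustration:

```python
# Illustrative sketch of the proposed verification tool. The project
# pairs below are examples, not actual declared dependencies.

def missing_directions(dependencies, defined_tests):
    """dependencies: set of (consumer, provider) pairs declared by projects.
    defined_tests: set of (tester, tested) pairs that have an actual test.
    Returns (untested, one_way): declared pairs with no test in either
    direction, and pairs tested in only one direction."""
    untested = set()
    one_way = set()
    for a, b in dependencies:
        forward = (a, b) in defined_tests
        backward = (b, a) in defined_tests
        if not forward and not backward:
            untested.add((a, b))
        elif forward != backward:
            one_way.add((a, b))
    return untested, one_way

# Example: NBI tested the NBI-SO pair, SO did not test it back,
# and nobody tested SO-SDNC.
deps = {("NBI", "SO"), ("SO", "SDNC")}
tests = {("NBI", "SO")}
untested, one_way = missing_directions(deps, tests)
print(untested)  # prints {('SO', 'SDNC')}
print(one_way)   # prints {('NBI', 'SO')}
```

Fed with the formally declared dependencies from the first recommendation, such a check could run automatically at M4 to report coverage gaps before RC0.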




2 Comments

  1. Hi Eric Debeau - Thanks for this initiative.

    A lot of pairwise test cases could be covered under use-case execution flows. It would be better to group the pairwise cases per use case and build CSIT around them.

    Also, the current way of tracking pair-wise tests per project introduces a gap where the same dependencies (interfaces) are not tracked consistently across the interfacing projects. A central ONAP-level view and tracking of all the pair-wise cases would help with accurate reporting.

  2. from TSC 2018-12-20

    From Me to Everyone: (10:22)
    
    CD done on OOM via helm charts for 1 or more components before merge by Gerrit magic word - in progress with 3 LF personnel - we are at the point of paying for a target VM to host a minimal k8s/oom cluster right now - meets usually Friday at 10

    From Me to Everyone: (10:22)
    TSC-25

    
above can run automated CSIT

    We also have the onapci.org CD and the http://kibana.onap.info:5601/app/kibana#/dashboard/AWAtvpS63NTXK5mX2kuS CD PoCs we can leverage for CD