It depends on what you mean by use cases...

We may distinguish two main categories:

  • use cases created for a release
  • use cases integrated in CI

Release use cases

The use cases associated with a release must be described in the official documentation.

See the list of these use cases in the documentation of the previous versions.

Please note that, if not automated, these use cases may be closely linked to a given release and not maintained over time.

The Integration team can only maintain the automated use cases that do not rely on proprietary VNFs or tooling.

You may find lots of material in the wiki, but the wiki is not an official source, so it can be misleading or out of date.

Automated use cases

These use cases are integrated by the Integration team in the different CI/CD chains.

If they have been automated, they must have been declared in the official test DB: http://testresults.opnfv.org/onap/api/v1/projects

The Integration team maintains the integration and security tests.

You can list these tests through the API (see the query sketch after the list below):

Project: integration
Link: http://testresults.opnfv.org/onap/api/v1/projects/integration/cases
Tests:
  • core
  • small
  • medium
  • full
  • healthdist
  • postinstall
  • basic_vm
  • freeradius_nbi
  • clearwater_ims
  • pnf-registrate
  • onap-k8s
  • onap-helm
  • hv-ves
  • 5gbulkpm
  • nodeport_check_certs
  • basic_network
  • basic_cnf
  • ves-collector
  • cmpv2
Project: security
Link: http://testresults.opnfv.org/onap/api/v1/projects/security/cases
Tests:
  • root_pods
  • unlimitted_pods
  • cis_kubernetes
  • http_public_endpoints
  • nodeport_ingress
  • jdpw_ports
  • kube_hunter
  • nonssl_endpoints
  • versions
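
For illustration, here is a minimal Python sketch that queries the test DB API above and prints the declared cases. The endpoint URLs are the ones listed here; the JSON field names ("testcases", "name") are assumptions about the test DB response format and may need adjusting.

    # Minimal sketch: list the automated test cases declared in the official test DB.
    # The endpoint URLs come from this page; the JSON field names are assumptions.
    import requests

    BASE = "http://testresults.opnfv.org/onap/api/v1/projects"

    for project in ("integration", "security"):
        resp = requests.get(f"{BASE}/{project}/cases", timeout=10)
        resp.raise_for_status()
        data = resp.json()
        # Assumed response shape: {"testcases": [{"name": ...}, ...]}
        for case in data.get("testcases", []):
            print(f"{project}: {case.get('name')}")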

The code of these tests may be distributed in many places: upstream repositories, internal ONAP repositories. They can be written in different languages and use different frameworks, but we use the xtesting framework to unify the way they are launched. You can get all the information from the Dockerfiles of the xtesting dockers (https://gerrit.onap.org/r/admin/repos/integration/xtesting); see the launch sketch below.
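
As an illustration, a minimal sketch of how one of these tests could be launched through an xtesting container. The image name below is purely hypothetical; the real images and their contents are defined by the Dockerfiles in the integration/xtesting repository, and real runs usually need additional configuration (environment variables, mounted config) not shown here.

    # Illustrative only: launch one test case through an xtesting container.
    # The image name is hypothetical; check the integration/xtesting Dockerfiles
    # for the actual images. "run_tests -t <case>" is the xtesting command used
    # to run a declared test case.
    import subprocess

    IMAGE = "example.registry/onap/xtesting-smoke:latest"  # hypothetical image name
    CASE = "basic_vm"  # one of the integration cases listed above

    subprocess.run(
        ["docker", "run", "--rm", IMAGE, "run_tests", "-t", CASE],
        check=True,
    )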
