
Project Name:

  • Proposed name for the project: Integration
  • Proposed name for the repository: integration

Project description:

System Integration and Testing is responsible for ONAP cross-project system integration and all related end-to-end release use case testing with VNFs, necessary for the successful delivery and industry adoption of the ONAP project as a whole.

Scope:

The project provides the cross-project infrastructure framework and DevOps toolchain (continuous integration, etc.), code and scripts, best-practice guidance and benchmarks, and testing reports and white papers related to:

  • Cross-project Continuous System Integration Testing (CSIT)
  • End-to-end release use case testing with VNFs, with repeatability
  • Service design for end-to-end release use cases
  • Release delivery of the ONAP project
  • Open Lab: building and maintaining community integration labs
  • Building reference VNFs that can be used to show how the ONAP platform manages VNF installation and lifecycle management
  • Continuous Distribution (CD) to ONAP community integration labs



The scope breaks down into the following categories, with the problem being solved noted for each where it was identified.

1. Test

  • Automated testing infrastructure and tools
    • CSIT: testing of individual ONAP microservices and small collections of ONAP microservices, supported by mocked services as necessary (see the health-check sketch after this category)
    • End-to-End (ETE) test flows using a full ONAP deployment
  • Code and tools for automatic system testing and continuous integration test flows across ONAP projects
  • Common guidelines, templates, generic tools, infrastructure, and best practices to help project developers write unit and system test code

Problem being solved:

  • Automate the building of artifacts/binaries to minimize human error and reduce engineering costs
  • Ensure that changes in one project do not break the functionality of other projects
  • Assure that the entire ONAP project/product functions correctly under continual change in subprojects
  • Ensure consistency in unit and system testing methodology across all the ONAP projects
  • Capture security issues
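As an illustration of the CSIT approach, here is a minimal sketch in Python (using the requests library) of the kind of readiness gate a CSIT suite runs before its functional test cases. The service URL and endpoint are placeholder assumptions, not an actual ONAP component interface.

    # Minimal CSIT-style readiness gate: poll a microservice health endpoint
    # until it answers, then let the functional test cases proceed.
    import time
    import requests

    SERVICE_URL = "http://localhost:8080/healthcheck"  # placeholder endpoint

    def wait_until_healthy(url, timeout_s=120, interval_s=5):
        """Return True once the service answers 200 OK, False on timeout."""
        deadline = time.monotonic() + timeout_s
        while time.monotonic() < deadline:
            try:
                if requests.get(url, timeout=5).status_code == 200:
                    return True
            except requests.RequestException:
                pass  # service not up yet; keep polling
            time.sleep(interval_s)
        return False

    if __name__ == "__main__":
        assert wait_until_healthy(SERVICE_URL), "service never became healthy"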

2. CI Management (ci-management repo)

  • Scripts and definitions for build and CI jobs in Jenkins
    • includes any docker build jobs for mock/simulated services
    • excludes docker build jobs for ONAP components (assumed to be handled by the ONAP Operations Manager project)

Problem being solved:

  • Required to support the execution of CI jobs (e.g., in Jenkins; see the sketch below)
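The job definitions themselves live in the ci-management repo as Jenkins Job Builder YAML; the sketch below only illustrates interacting with the resulting Jenkins jobs from Python via the python-jenkins library. The server URL, job name, and credentials are placeholder assumptions.

    # Trigger and inspect a (hypothetical) verify job on a Jenkins server.
    import jenkins

    server = jenkins.Jenkins("https://jenkins.example.org",
                             username="user", password="api-token")

    # Queue a parameterized build of the hypothetical verify job.
    server.build_job("integration-master-verify", {"GERRIT_BRANCH": "master"})

    # Confirm the job is known to the server and currently buildable.
    info = server.get_job_info("integration-master-verify")
    print(info["name"], "buildable:", info["buildable"])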

3. Autorelease

  • Define a community-wide artifact versioning and release strategy (see the sketch after this list)
  • Scripts and Jenkins job definitions to build the artifacts/binaries (e.g. zip/tar.gz files) that are used in the release candidates and final release
  • Detect/resolve cross-project compilation dependency issues
  • Generate release candidates and final release artifacts
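For illustration, here is a small sketch of one mechanical piece of such a strategy: bumping a semantic version string for a release candidate. The major.minor.patch scheme is assumed for the example and is not necessarily the project's mandated format.

    # Bump one part of a 'major.minor.patch' version string.
    def bump(version, part="patch"):
        major, minor, patch = (int(p) for p in version.split("."))
        if part == "major":
            return f"{major + 1}.0.0"
        if part == "minor":
            return f"{major}.{minor + 1}.0"
        return f"{major}.{minor}.{patch + 1}"

    assert bump("1.0.0") == "1.0.1"           # patch release candidate
    assert bump("1.0.1", "minor") == "1.1.0"  # next minor release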

4. Distribution (this will use the ONAP Operations Manager project)

(Descoping, since Common Serv 1 has something like this; should these projects be kept separate or merged?)

  • Current decision is to go with docker images as the primary distribution method
  • Docker builds and images are assumed to be handled by the ONAP Operations Manager project.
Problem being solved: N/A

5. Packaging (this will use the ONAP Operations Manager project)

(Checking with Common Serv 1.)

  • Current decision is to forgo any deb or RPM packages, and go with docker images as the primary distribution method
  • Docker builds and images are assumed to be handled by the ONAP Operations Manager project.
Problem being solved: N/A

6. S3P

  • Test cases for performance, scalability, resilience/stress, and longevity testing (a toy measurement sketch follows this list)
  • Benchmarking and performance white papers
  • Define standard S3P testing metrics
  • Provide and publish benchmarking results
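A hedged sketch of what an S3P performance measurement can look like in Python: fire concurrent requests at a service and report latency percentiles. The URL, request count, and concurrency level are illustrative placeholders; a real S3P run would use a dedicated load tool and the standard metrics defined above.

    # Toy load test: 200 GET requests with 20 workers, then p50/p95 latency.
    import statistics
    import time
    from concurrent.futures import ThreadPoolExecutor

    import requests

    URL = "http://localhost:8080/api"  # placeholder endpoint

    def timed_request(_):
        start = time.monotonic()
        requests.get(URL, timeout=10)
        return time.monotonic() - start

    with ThreadPoolExecutor(max_workers=20) as pool:
        latencies = sorted(pool.map(timed_request, range(200)))

    print(f"p50 = {statistics.median(latencies) * 1000:.1f} ms")
    print(f"p95 = {latencies[int(len(latencies) * 0.95)] * 1000:.1f} ms")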

7. Infrastructure Specification

  • Develop the specifications for the “ONAP compliant” deployment and test environment
  • Assist in the planning and procurement of the necessary hardware and infrastructure for setting up ONAP environments

8. Bootstrap

  • A framework to automatically install and test a set of base infrastructure components for new developers (see the preflight sketch after this list)
  • Reduce the barrier to entry so that new ONAP developers can ramp up onto active development quickly
  • Reduce the cost to the community of answering the simple environment setup questions faced by new developers
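A minimal sketch of the kind of preflight check such a bootstrap framework might start with: verifying that the base tools are on the PATH before attempting any setup. The tool list is an assumption for illustration.

    # Fail fast if a new developer's machine lacks the assumed base tools.
    import shutil
    import sys

    REQUIRED_TOOLS = ["git", "docker", "java", "mvn"]  # assumed prerequisites

    missing = [t for t in REQUIRED_TOOLS if shutil.which(t) is None]
    if missing:
        sys.exit("missing prerequisites: " + ", ".join(missing))
    print("all prerequisites found; environment ready for setup")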

9. End-to-end release use case testing with VNFs, with repeatability

  • Create automated test cases and scripts for VF testing
  • Perform VF compliance testing and verification using tools provided by ONAP
  • Deliver the testing reports and white papers
  • Assist in defining the testing metrics
  • Reduce adoption risks for end users

10. Open Lab

  • Scripts and definitions for setting up a POC sample deployment of use cases in lab settings
  • Provisioning, installation, and setup of all the telco equipment, such as switches, routers, and gateways, to enable end-to-end testing
  • Allow remote access to the lab environment for interoperability testing
  • Automatic updates of code in the lab environment from future releases
  • Support the need for consistent, reproducible lab setups for demo and POC purposes
  • Promote easy interoperability testing with different hardware devices, SDN controllers, etc.
  • Automate the process of keeping the lab code up to date with the latest changes
11. Reference VNFs Project (now part of the Integration Project)

Two basic VNFs, namely a virtual firewall and a virtual load balancer (with virtual DNSs), have been provided. The objectives of the project are to improve, extend and maintain the vFirewall and vLoadBalancer VNFs:

  • Allow ONAP to change vFirewall rules during execution (see the sketch after this list)
  • Platform independence (Rackspace, vanilla OpenStack, Azure, ...)
  • Visualization tools that allow users to monitor the behavior of the reference VNFs as well as the effect of ONAP closed-loop operations against the VNFs
  • Tools that allow users to interact with the reference VNFs (e.g., alter the behavior of a VNF so as to violate predefined policies, in order to trigger ONAP closed-loop operations)
  • The goal is to build reference VNFs that can be used to show how the ONAP platform manages VNF installation and lifecycle management
  • Reference VNFs can also be used as a means to test the platform itself, e.g., to verify whether VNF on-boarding, deployment, and ONAP closed-loop operations work
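As a purely hypothetical sketch of the rule-change interaction named in the first bullet: push a new packet-rate limit to a management endpoint on the reference vFirewall. The URL and JSON payload are invented for illustration and do not reflect the actual VNF interface.

    # Hypothetical: tighten the vFirewall rate limit to provoke a policy
    # violation and so trigger an ONAP closed-loop operation.
    import requests

    VFW_MGMT = "http://10.0.0.5:8183/rules"  # placeholder management endpoint

    def set_rate_limit(packets_per_second):
        resp = requests.post(VFW_MGMT,
                             json={"rate_limit_pps": packets_per_second},
                             timeout=5)
        resp.raise_for_status()

    set_rate_limit(100)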
12. O-Parent

  • ONAP Parent provides common default settings for all the projects participating in the simultaneous release
  • Isolate all the common external dependencies: default versions, dependency management, plugin management, etc.
    • Avoids duplicate/conflicting settings in each project
  • Each project sets its parent to inherit the defaults from ONAP Parent
  • Project-level external dependencies and versions can be overridden if necessary


Testing Principles (in progress)

  • We expect test automation for all testing in scope for release 1.0.
  • Regression, unit, and feature/function testing should be triggered by the build process.
  • All testing must be able to execute on the selected ONAP environments.
  • Unit testing for any project should have at least 30% code coverage.
  • Any new feature should be delivered with its associated unit tests/feature tests (see the sketch below).
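To make the last principle concrete, here is a minimal sketch of a feature function shipped together with its unit test, using only the Python standard library. The function itself is a toy invented for the example; the 30% coverage floor can be enforced separately, e.g. with coverage.py's --fail-under option in the build job.

    # A toy feature and the unit test delivered with it.
    import unittest

    def normalize_vnf_name(name):
        """Toy feature: canonicalize a VNF name for lookup."""
        return name.strip().lower().replace(" ", "-")

    class TestNormalizeVnfName(unittest.TestCase):
        def test_strips_and_lowercases(self):
            self.assertEqual(normalize_vnf_name("  vFirewall Demo "),
                             "vfirewall-demo")

    if __name__ == "__main__":
        unittest.main()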

Testing Roles and Responsibilities (in progress)

Types of testing and the responsible teams (Dev. Team, CSIT Team, E2E Team, S3P Team):

  • Unit Testing – Dev. Team
  • Feature/Functional Testing – Dev. Team
  • Integration/Pair-Wise Testing – CSIT Team
  • End-to-End Testing – E2E Team
  • Regression Testing – Dev. Team, CSIT Team, E2E Team, S3P Team
  • Performance Testing – S3P Team
  • Acceptance Testing – CSIT Team, E2E Team
  • Usability Testing – Dev. Team
  • Install/Uninstall Testing – Dev. Team
  • Recovery Testing – E2E Team, S3P Team
  • Security Testing – S3P Team
  • Stability Testing – S3P Team
  • Scalability Testing – S3P Team
  • Application Testing – Dev. Team

Testing Terminology

Unit Testing (UT) – Unit testing focuses on individual software components or modules. Typically, UT is done by the programmers and not by testers, as it requires detailed knowledge of the internal program design and code; UT may require developing test driver modules or test harnesses. Code coverage is also one of the objectives of UT.
Feature/Functional Testing – Feature/Functional testing, unlike unit testing, focuses on the output as defined per requirement (user story). This type of testing is black-box type testing, geared towards functional requirements on an application basis.
Integration/Pair-Wise Testing – Integration/Pair-wise testing integrates all of the modules of the solution in a single environment to verify combined functionality after integration. It ensures everything comes together and there is end-to-end communications between all the integrated elements.
End-to-End Testing – End-to-end testing involves testing of a complete application environment in scenarios that closely mimic real-world use, such as interacting with a database, using network communications, or interacting with other hardware, applications, or systems, if appropriate.
Regression Testing – Testing the application as a whole for the modification in any module or functionality. Typically, automation tools are used for regression testing since it is difficult to cover all aspects of the system in a manual fashion.
Performance Testing – Performance testing tests the solution to see what throughput levels can be achieved on the given platform. The term is often used interchangeably with ‘stress’ testing (which pushes the system beyond its specifications to check how and when it fails) and ‘load’ testing (which checks the system behavior under steady load). An objective of this testing is to verify that the system meets performance requirements. Different performance and load tools are used for this type of testing. [Note: Capacity testing and performance testing are closely aligned. Capacity testing tests against product requirements to ensure the system meets them. Performance testing pushes the system to the highest numbers it can achieve before it becomes unstable or the success rate is unacceptable.]
Acceptance Testing – Normally this type of testing is done to verify that the system meets the customer-specified requirements. Users or customers do this testing to determine whether to accept the application.
Usability Testing – Usability testing is focused on testing of user interfaces (UIs). It checks user-friendliness along with application flows; e.g., can new users understand the application easily, and is proper help documented wherever users get stuck. Basically, system navigation is checked in this testing.
Install/Uninstall Testing – This category of testing validates full or partial install/uninstall, upgrade, and rollback processes on different operating systems under different hardware and software environments, including backup/restore mechanisms.
Recovery Testing – This category of testing validates how well a system recovers from crashes, hardware failures, or other catastrophic problems in various configurations such as HA and Geo-Redundancy.
Security Testing – Security testing looks for vulnerabilities in the software that would make the system susceptible to malicious users. It tests how well the system is protected against unauthorized internal or external access, and checks that the system and its databases are safe from external attacks.
Stability Testing – This type of testing validates that the system can run in steady state for an extended period of time without any system downtime or crashes. A typical stability test runs for 72 hours. If any problems are encountered during the stability test, such as an application crash or high failure rates, the test is stopped and the issue is investigated. The stability test is restarted after a fix is identified.
Scalability Testing – This type of testing is an extension of performance testing. The purpose of scalability testing is to validate that the system can scale efficiently by identifying major workloads and mitigating bottlenecks that can impede the scalability of the application.
Application Testing – Tests the platform in the context of a particular application.

Architecture Alignment:

  • How does this project fit into the rest of the ONAP Architecture?
  • What other ONAP projects does this project depend on?
    • All ONAP projects
  • How does this align with external standards/specifications?

  • Are there dependencies with other open source projects?
    • Robot
    • Jenkins
    • OpenStack
    • Docker

Resources:

  • Primary Contact Person
  • Names, gerrit IDs, and company affiliations of the committers
(Name – Gerrit ID – Email – Company Affiliation – Time Zone; Gerrit IDs listed where known)

  • Helen Chen – helenc878 – helen.chen@huawei.com – Huawei – Santa Clara, CA, USA
  • Chengli Wang – wangchengli@chinamobile.com – China Mobile – Beijing, China
  • Daniel Rose – dr695h@att.com – AT&T – Middletown, NJ, USA
  • Steven Smokowski – ss835w@att.com – AT&T – Middletown, NJ, USA
  • Marco Platania – platania@research.att.com – AT&T – Bedminster, NJ, USA
  • Christophe Closset – cc697w@intl.att.com – AT&T – Belgium
  • Anael Closson – ac2550@intl.att.com – AT&T – Belgium
  • Hector Anapan-Lavalle – ha076r@att.com – AT&T – Middletown, NJ, USA
  • Xiaolong Kong – xiaolong.kong@orange.com – Orange – Orange Gardens, France
  • François Despres – francois.despres@orange.com – Orange – Orange Gardens, France
  • Yi Yang – yangyi.bri@chinatelecom.cn – China Telecom
  • Luman Wang – wanglm.bri@chinatelecom.cn – China Telecom
  • Guangmin Liu – liuguangmin@huawei.com – Huawei – Shenzhen, China
  • Gary Wu – gary.i.wu@huawei.com – Huawei – Santa Clara, CA, USA
  • Kang Xi – kang.xi@huawei.com – Huawei – Bridgewater, NJ, USA
  • Yang Xu – yang.xu3@huawei.com – Huawei – Bridgewater, NJ, USA
  • Jianwen Ai – aijianwen@huawei.com – Huawei – Shenzhen, China
  • Murali P – murali.p@huawei.com – Huawei
  • Dmitriy Andrushko – dandrushko@mirantis.com – Mirantis – CA, USA
  • Elhay Efrat – elhay.efrat@amdocs.com – Amdocs
  • Marc Volovic – marc.volovic@amdocs.com – Amdocs
  • Abhinav Singh – as0074150@techmahindra.com – TechMahindra
  • Sandeep Singh – ss0074540@techmahindra.com – TechMahindra
  • Jinhua Fu – fu.jinhua@zte.com.cn – ZTE
  • Yunlong Ying – ying.yunlong@zte.com.cn – ZTE
  • Yuanxing Feng – feng.yuanxing@zte.com.cn – ZTE
  • Xinhui Li – xinhuili – lxinhui@vmware.com – VMware – Beijing, China
  • Names and affiliations of any other contributors
(Name – Email – Company Affiliation – Time Zone)

  • Oliver Spatscheck – spatsch@research.att.com – AT&T – Bedminster, NJ, USA
  • Catherine Lefevre – cl664y@intl.att.com – AT&T – Belgium


  • Project Roles (include RACI chart, if applicable)
  • Other Information:
  • link to seed code (if applicable)

ECOMP existing repos:

      • testsuite   
      • testsuite/heatbridge   
      • testsuite/properties   
      • testsuite/python-testing-utils
      • demo
      • ci-management

OPEN-O existing repos:

      • integration
      • ci-management
      • oparent
  • Vendor Neutral

This project is vendor neutral

  • Meets Board policy (including IPR)

yes


Key Project Facts

Project Name: Integration

JIRA project name: integration

JIRA project prefix: integration

Repo name:

    • integration
    • demo
    • testsuite   
    • testsuite/heatbridge   
    • testsuite/properties   
    • testsuite/python-testing-utils
    • ci-management
    • oparent


Lifecycle State: incubation
Primary Contact: Helen Chen
Project Lead:
Mailing list tag: [integration]
Committers:

See above

*Link to TSC approval: 
Link to approval of additional submitters: 
