Beijing Feedback

To plan the appropriate Platform Maturity requirements and levels for Casablanca, we would like to gather feedback from the community. We welcome feedback from all, but especially:

  • Project Teams (PTL or representative)
  • ONAP Operators

As a reminder, the Beijing requirements can be found at Platform Maturity Requirements (aka Carrier Grade). The summary of priorities and levels is at Platform Maturity Level proposal 13Dec2017v2.pdf

Project Team Feedback Requested

The type of feedback we would welcome from project teams includes:

  • What worked well in Beijing?
  • What could be improved?
  • Where could you use help in platform, tools, education?
  • Specific feedback on any particular requirement areas?

Operator Feedback Requested

From operators, we would welcome the following feedback:

  • Which platform maturity requirement areas are important to you in implementing ONAP?
  • Are there requirements not currently included that you would like to see included?


Please leave your feedback as comments to this wiki page.

THANK YOU!


Security Subcommittee Feedback


Architecture Subcommittee Feedback (from Vancouver Session)

  • Stability: 72-hour platform-level soak test with random transactions (not just component-level testing, as was done for Beijing). The TSC has to approve this, and we have to understand the resources needed, as it may require more lab resources. A rough sketch of such a soak driver follows this list.
  • Resiliency: level 3 for run-time projects (proposal); no change for design-time projects
  • Security: see above from the Security Subcommittee
  • Scalability: stay at level 1 for run-time projects. Horizontal scaling up and down.
  • Manageability: level 2, upgrading a single component (one component at a time) with no loss of data
  • Usability: more discussion with a wider group needed
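
To make the stability proposal concrete, below is a minimal sketch of a platform-level soak driver: it fires a random mix of read-only transactions at component endpoints for 72 hours and reports the success rate. The endpoint URLs and transaction mix are hypothetical placeholders, not real ONAP addresses, and the actual load levels and lab capacity would still need the TSC review noted above.

```python
#!/usr/bin/env python3
"""Sketch of a 72-hour platform-level soak driver (illustrative only)."""
import random
import time

import requests

# Hypothetical read-only transactions against platform components;
# replace with real ONAP endpoints for an actual soak run.
TRANSACTIONS = [
    ("aai-models", "http://aai.example.org:8080/aai/v14/service-design-and-creation/models"),
    ("so-health", "http://so.example.org:8080/manage/health"),
    ("sdc-catalog", "http://sdc.example.org:8080/sdc/v1/catalog/services"),
]

SOAK_HOURS = 72
deadline = time.time() + SOAK_HOURS * 3600
ok = failed = 0

while time.time() < deadline:
    name, url = random.choice(TRANSACTIONS)  # random transaction mix
    try:
        resp = requests.get(url, timeout=30)
        if resp.status_code < 500:
            ok += 1
        else:
            failed += 1
            print(f"{time.ctime()} {name}: HTTP {resp.status_code}")
    except requests.RequestException as exc:
        failed += 1
        print(f"{time.ctime()} {name}: {exc}")
    time.sleep(random.uniform(0.5, 5.0))  # pace the load

total = max(ok + failed, 1)
print(f"soak done: {ok} ok, {failed} failed ({100.0 * ok / total:.2f}% success)")
```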

Presentation for the discussion session held 20June2018 in Beijing.  


Presentation as presented to TSC on 21June2018:


Updates to presentation based on feedback from the TSC:



3 Comments

  1. Some comments sent via e-mail (Catherine Lefevre, Roger Maitland, etc.):

    • Resiliency testing has shown long recovery times, which appear to be related to scheduling the containers on new nodes
    • Using Kubernetes resource requests and limits will help address this issue; OOM has already set these up in the templates
    • Pod affinity and anti-affinity will also help and should be pursued in Casablanca; these are also in the templates created by OOM (see the validation sketch after this list)
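
    As a rough illustration of how this could be verified, the sketch below scans rendered chart output for containers that lack resource requests/limits and for pod specs without (anti-)affinity rules. It assumes manifests are piped in from `helm template`; this is a sanity check written for this discussion, not the OOM team's actual tooling.

```python
#!/usr/bin/env python3
"""Check rendered Kubernetes manifests for resource limits and affinity."""
import sys

import yaml  # PyYAML

missing = []
for doc in yaml.safe_load_all(sys.stdin):
    # Skip empty documents and non-workload kinds in the rendered output.
    if not doc or doc.get("kind") not in ("Deployment", "StatefulSet"):
        continue
    name = doc["metadata"]["name"]
    pod_spec = doc["spec"]["template"]["spec"]
    for container in pod_spec.get("containers", []):
        resources = container.get("resources") or {}
        if not resources.get("requests") or not resources.get("limits"):
            missing.append(f"{name}/{container['name']}: no resource requests/limits")
    if not pod_spec.get("affinity"):
        missing.append(f"{name}: no (anti-)affinity rules")

print("\n".join(missing) or "all workloads set resources and affinity")
sys.exit(1 if missing else 0)
```

    A project could run this as, e.g., `helm template <project-chart> | python3 check_charts.py` (chart path and script name are placeholders).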

    Helen Chen also added:

    Changing the configuration and process for K8S / OOM deployment: with the best-practice guidance from OOM, each project should handle its own project-specific charts. The current issues are:

    1. All the testing and code review fall solely on the OOM team, which leaves them overloaded
    2. Project owners know their own projects best and can “tune” the configuration to get better performance for their project.

    Jason Hunt asks: these topics seem to be leading to a need for an overall view on capacity planning. How big should the environment be to support all the ONAP components in a resilient fashion? Are there various deployment sizes that we need to consider (e.g., a dev setup with no resiliency, a lab environment, a scale-out production environment, multi-region)?
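
    One hedged way to frame that question is as a set of named sizing profiles that map to deployment overrides. The numbers and keys below (replica counts, resource scaling, a podAntiAffinityEnabled flag) are illustrative placeholders, not measured ONAP capacity figures or real OOM chart values.

```python
"""Hypothetical deployment sizing profiles (illustrative placeholders)."""

PROFILES = {
    "dev":        {"replicas": 1, "resource_scale": 0.5, "anti_affinity": False},
    "lab":        {"replicas": 2, "resource_scale": 1.0, "anti_affinity": True},
    "production": {"replicas": 3, "resource_scale": 2.0, "anti_affinity": True},
}

def helm_overrides(profile: str) -> dict:
    """Turn a named profile into Helm-style override values.

    The keys here are hypothetical, chosen for the example; real OOM
    charts define their own values structure.
    """
    p = PROFILES[profile]
    return {
        "global": {
            "replicaCount": p["replicas"],
            "resourceScale": p["resource_scale"],
            "podAntiAffinityEnabled": p["anti_affinity"],
        },
    }

print(helm_overrides("production"))
```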

  2. Input from Adolfo Perez-Duran on Usability requirements:

    1. Consider expanding usability requirement by defining at least two categories of "users."

    Category 1: Users that see ONAP as a platform: operations teams at telecom operators, VARs, and system integrators

    Category 2: ONAP developers

    2. Expand usability metrics as follows

    • ONAP User (operator, VARs, integrators)
      • Level 1 
        • Deployment and platform administration
          • Documentation is available
          • Deployment tutorial available
        • Service design and deployment
          • Documentation available
          • Service design and deployment tutorial available
      • Level 2
        • ONAP Platform can be deployed on different platforms (OS, CPU architecture)
        • ONAP can be deployed in less than x hours
        • External API documentation available
        • Service discovery and registration available (to add and use external controllers and applications)
    • ONAP Developer (developer, tester, technology vendors)
      • Level 1
        • API documentation
        • Adherence to coding guidelines
        • Consistent UI across ONAP components
      • Level 2
        • Adherence to API design guidelines
        • Adherence to standard data model (when applicable)
        • Usability testing conducted
        • Tutorial documented

    You can think of ONAP as a platform or as a set of projects/components/services. Other partitions are certainly possible, but these two address key categories of user personas.

    "Users" of ONAP as a platform: concerned with the deployment and management (activation, updates, monitoring, etc.) of ONAP as a solution. What they do is in a way similar to what the OOM team does today. These users are not concerned, for example, with development practices or coding guidelines.

    ONAP developers: although some developers/projects will be interested in deploying a complete ONAP instance, most projects focus on a few components. These developers do care about code quality, and may be more interested in adherence to recommended practices for API design, API documentation, code development and testing, and standard models.

    3. Tie usability to Casablanca's theme of deployability

    Usability is closely related to, and should be aligned with, the theme of the Casablanca release: deployability. By definition, ONAP must be vendor agnostic. By implication, ONAP must be infrastructure and compute platform agnostic (different cloud providers, different OSs, different CPU architectures, etc.).

  3. From an operator perspective, the platform made considerable strides in the S3P non-functional dimensions that were required for the ONAP Beijing Release. For Casablanca, projects should be encouraged to “level up” across the S3P dimensions, but leveling up should be treated as a stretch goal for each PTL to allow them to balance commitments across all development needs.

    Each project should be required to maintain at least its Beijing-passing level in each S3P dimension, so the platform does not regress (e.g., with the addition of new code, unit test coverage cannot fall below the Beijing-passing level of 50%).

    Beyond that, we should consider enhancing the existing S3P dimensions in a couple of ways:

    1. Given that the proposal for “Versioning & API Documentation Guidelines for Casablanca” was reviewed at last week’s TSC meeting, we should include adherence to the ONAP API Common Versioning Strategy (CVS) either as part of the Usability S3P dimension or as a new standalone dimension to be measured (perhaps called Maintainability). I’d propose we measure 3 levels of adherence, with Level 1 required for all projects in Casablanca; a minimal illustration of a CVS-style endpoint follows the level definitions below.

      1. Level 1 = All new APIs developed by a project in the Casablanca release adhere to the ONAP API Common Versioning Strategy and Documentation Guidelines

      2. Level 2 = All new APIs and all existing APIs that are modified by a project in the Casablanca release adhere to the ONAP API Common Versioning Strategy and Documentation Guidelines

      3. Level 3 = All APIs for a given project adhere to the ONAP API Common Versioning Strategy and Documentation Guidelines
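
    For illustration only, a minimal endpoint following a CVS-style scheme might look like the sketch below. It assumes the convention of the major version in the URI with minor/patch versions advertised via X-MinorVersion / X-PatchVersion / X-LatestVersion response headers; the CVS proposal reviewed at the TSC is the authoritative reference for the exact rules.

```python
"""Illustrative CVS-style versioned endpoint (Flask; header names assumed)."""
from flask import Flask, jsonify

app = Flask(__name__)
API_VERSION = (1, 4, 2)  # major.minor.patch, made-up numbers

@app.route("/v1/widgets")  # only the major version appears in the URI
def list_widgets():
    resp = jsonify(widgets=[])
    # Minor/patch/latest versions advertised via headers, per the assumed convention.
    resp.headers["X-MinorVersion"] = str(API_VERSION[1])
    resp.headers["X-PatchVersion"] = str(API_VERSION[2])
    resp.headers["X-LatestVersion"] = ".".join(map(str, API_VERSION))
    return resp

if __name__ == "__main__":
    app.run(port=8080)
```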


    2. A new S3P Dimension for Access Control should be added (or the existing “Security” dimension extended) to drive consistency across the platform and support for fine-grained authorization. To achieve this, all projects should be strongly encouraged to integrate the CADI framework into their components (CADI is integrated with AAF by design). Depending on feasibility, it may need to be considered a stretch goal for Casablanca rather than a hard requirement.
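
    CADI itself is a Java framework, so the sketch below only mirrors, in runnable form, the idea of AAF-style fine-grained permissions (type | instance | action triples); it is not CADI's actual API, and the permission names and values are made up.

```python
"""Illustrative AAF-style permission check (not CADI's real API)."""
from fnmatch import fnmatch

# Permissions a user holds, e.g. as granted via AAF roles (made-up values).
USER_PERMS = {
    "alice": [
        ("org.onap.so.workflow", "*", "read"),
        ("org.onap.so.workflow", "instance-42", "execute"),
    ],
}

def is_authorized(user: str, perm_type: str, instance: str, action: str) -> bool:
    """Match a requested (type, instance, action) against granted permissions."""
    return any(
        t == perm_type and fnmatch(instance, i) and a in (action, "*")
        for t, i, a in USER_PERMS.get(user, [])
    )

assert is_authorized("alice", "org.onap.so.workflow", "instance-7", "read")
assert not is_authorized("alice", "org.onap.so.workflow", "instance-7", "execute")
```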


    3. The Manageability S3P dimension should be enhanced to include component adherence to the ONAP Application Logging Specification v1.2 (Casablanca).  For consistency of components across the platform, all projects should be strongly encouraged to fully adhere to the spec.  However, it may be best to treat 100% adherence as a stretch goal for Casablanca and establish a method for measuring adherence in this release.
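
    One possible method for measuring adherence, sketched under the assumption that components emit their MDCs as key=value pairs: scan log output for required fields and report lines that miss them. Only two required MDCs (RequestID, InvocationID) are listed here; the full field list and the exact layout must be taken from the logging specification itself.

```python
"""Sketch: report log lines missing required MDC fields (partial list)."""
import re
import sys

REQUIRED_MDCS = ("RequestID", "InvocationID")  # partial; see the logging spec

def check(path: str) -> int:
    """Return the number of lines missing at least one required MDC."""
    bad = 0
    with open(path) as fh:
        for lineno, line in enumerate(fh, 1):
            missing = [m for m in REQUIRED_MDCS if not re.search(rf"{m}=\S+", line)]
            if missing:
                bad += 1
                print(f"{path}:{lineno}: missing {', '.join(missing)}")
    return bad

if __name__ == "__main__":
    sys.exit(1 if check(sys.argv[1]) else 0)
```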