
Developer Setup

Gerrit/Git

Quickstarts

Committing Code

# stage and commit your changes
git add .
git commit -am "your commit message"
# amend the commit to add your Signed-off-by line
git commit -s --amend
# add an Issue-ID: line after the Change-Id: line in the commit message
# submit your commit to ONAP Gerrit for review
git review
# then track your review at https://gerrit.onap.org/r/#/dashboard/self
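git review wraps a plain Gerrit push; if the plugin is not installed, the same submission can be done with git directly. A minimal sketch, assuming a remote named origin and target branch master:

# push the commit straight to Gerrit's review ref (no git-review plugin needed)
git push origin HEAD:refs/for/master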

Amending existing Gerrit changes in review

# add new files/changes
git add .
# don't use -m - keep the same Issue-ID: line from the original commit
git commit --amend
git review -R
# watch the patch set number increase - https://gerrit.onap.org/r/#/c/17203/2
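To pick up a review that was started on another machine before amending it, git-review can download the change by number. A sketch, using change 17203 from the link above:

# fetch the latest patch set of change 17203 into a local branch
git review -d 17203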

Filter Gerrit reviews - thanks to Mandeep Khinda

https://gerrit.onap.org/r/#/q/is:reviewer+AND+status:open+AND+label:Code-Review%253D0
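The same filter can be run against the Gerrit REST API; a minimal sketch of the decoded query (is:reviewer is relative to the authenticated user, so Gerrit HTTP credentials and the /a/ authenticated prefix are assumed; the user/password placeholders are yours to fill in):

# list open changes where you are a reviewer and have not yet voted Code-Review
curl -u <user>:<http-password> "https://gerrit.onap.org/r/a/changes/?q=is:reviewer+AND+status:open+AND+label:Code-Review%3D0"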

Developer Testing

Sonar

Having trouble getting the "run-sonar" command to run Sonar: it skips the modules in the pom.

Looking at verifying Sonar coverage locally using EclEmma.
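Sonar analysis can also be attempted locally with the Maven Sonar plugin while the run-sonar issue is sorted out; a minimal sketch, assuming a local SonarQube instance on its default port (the host URL is an assumption, not the ONAP CI setup):

# build, then run analysis against a local SonarQube at localhost:9000
mvn clean install sonar:sonar -Dsonar.host.url=http://localhost:9000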


Developer Deployment

Deployment Integrity

ELK containers

Logstash port: 30255 (NodePort for 5044 - see the service listing below)

Elasticsearch port: 30254 (NodePort for 9200 - see the service listing below)

# get pod names and the actual VM that any pod is on
ubuntu@ip-10-0-0-169:~$ kubectl get pods --all-namespaces -o wide | grep log-
onap          onap-log-elasticsearch-756cfb559b-wk8c6                           1/1       Running            0          2h        10.42.207.254   ip-10-0-0-227.us-east-2.compute.internal
onap          onap-log-kibana-6bb55fc66b-kxtg6                                  0/1       Running            16         1h        10.42.54.76     ip-10-0-0-111.us-east-2.compute.internal
onap          onap-log-logstash-689ccb995c-7zmcq                                1/1       Running            0          2h        10.42.166.241   ip-10-0-0-111.us-east-2.compute.internal
onap          onap-vfc-catalog-5fbdfc7b6c-xc84b                                 2/2       Running            0          2h        10.42.206.141   ip-10-0-0-227.us-east-2.compute.internal
# get nodeport
ubuntu@ip-10-0-0-169:~$ kubectl get services --all-namespaces -o wide | grep log-
onap          log-es                     NodePort       10.43.82.53     <none>                                 9200:30254/TCP                                                               2h        app=log-elasticsearch,release=onap
onap          log-es-tcp                 ClusterIP      10.43.90.198    <none>                                 9300/TCP                                                                     2h        app=log-elasticsearch,release=onap
onap          log-kibana                 NodePort       10.43.167.146   <none>                                 5601:30253/TCP                                                               2h        app=log-kibana,release=onap
onap          log-ls                     NodePort       10.43.250.182   <none>                                 5044:30255/TCP                                                               2h        app=log-logstash,release=onap
onap          log-ls-http                ClusterIP      10.43.81.173    <none>                                 9600/TCP                                                                     2h        app=log-logstash,release=onap
# check nodeport outside container
ubuntu@ip-10-0-0-169:~$ curl ip-10-0-0-111.us-east-2.compute.internal:30254
{
  "name" : "-pEf9q9",
  "cluster_name" : "onap-log",
  "cluster_uuid" : "ferqW-rdR_-Ys9EkWw82rw",
  "version" : {
    "number" : "5.5.0",
    "build_hash" : "260387d",
    "build_date" : "2017-06-30T23:16:05.735Z",
    "build_snapshot" : false,
    "lucene_version" : "6.6.0"
  }, "tagline" : "You Know, for Search"
}
# check inside docker container - for reference
ubuntu@ip-10-0-0-169:~$ kubectl exec -it -n onap onap-log-elasticsearch-756cfb559b-wk8c6 bash
[elasticsearch@onap-log-elasticsearch-756cfb559b-wk8c6 ~]$ curl http://127.0.0.1:9200   
{
  "name" : "-pEf9q9",


Kibana port: 30253 (NodePort for 5601 - see the service listing above)
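Kibana can be checked from outside the cluster the same way as Elasticsearch; a sketch against the node from the listing above (Kibana 5.x answers on /api/status):

# check the Kibana NodePort from outside the cluster
curl http://ip-10-0-0-111.us-east-2.compute.internal:30253/api/status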


Pairwise Testing

AAI and Log Deployment

AAI, Log and Robot will fit on a 16G VM
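That footprint comes from deploying only those charts; a minimal sketch, assuming the OOM onap parent chart and a hypothetical disable-all.yaml override that sets enabled: false for every other component:

# deploy only AAI, Log and Robot from the OOM charts
# (disable-all.yaml is a hypothetical override that disables all other components)
helm install local/onap --name dev --namespace onap -f disable-all.yaml \
  --set aai.enabled=true \
  --set log.enabled=true \
  --set robot.enabled=true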

Deployment Issues

Ran into an issue running champ on a 16G VM (AAI/LOG/Robot only) with the master 20180509 build; it runs fine on a normal cluster with the rest of ONAP.

19:56:05 onap onap-aai-champ-85f97f5d7c-zfkdp 1/1 Running 0 2h 10.42.234.99 ip-10-0-0-227.us-east-2.compute.internal


http://jenkins.onap.info/job/oom-cd-master/2915/consoleFull

OOM-1015

Every 2.0s: kubectl get pods --all-namespaces                                                                                                 Thu May 10 13:52:47 2018

NAMESPACE     NAME                                     READY     STATUS        RESTARTS   AGE
kube-system   heapster-76b8cd7b5-9dg8j                 1/1       Running       0          10h
kube-system   kube-dns-5d7b4487c9-fj2wv                3/3       Running       2          10h
kube-system   kubernetes-dashboard-f9577fffd-c9nwp     1/1       Running       0          10h
kube-system   monitoring-grafana-997796fcf-jdx8q       1/1       Running       0          10h
kube-system   monitoring-influxdb-56fdcd96b-zpjmz      1/1       Running       0          10h
kube-system   tiller-deploy-54bcc55dd5-mvbb4           1/1       Running       2          10h
onap          dev-aai-babel-6b79c6bc5b-7srxz           2/2       Running       0          10h
onap          dev-aai-cassandra-0                      1/1       Running       0          10h
onap          dev-aai-cassandra-1                      1/1       Running       0          10h
onap          dev-aai-cassandra-2                      1/1       Running       0          10h
onap          dev-aai-cdc9cdb76-mmc4r                  1/1       Running       0          10h
onap          dev-aai-champ-845ff6b947-l8jqt           0/1       Terminating   0          10h
onap          dev-aai-champ-845ff6b947-r69bj           0/1       Init:0/1      0          25s
onap          dev-aai-data-router-8c77ff9dd-7dkmg      1/1       Running       3          10h
onap          dev-aai-elasticsearch-548b68c46f-djmtd   1/1       Running       0          10h
onap          dev-aai-gizmo-657cb8556c-z7c2q           2/2       Running       0          10h
onap          dev-aai-hbase-868f949597-xp2b9           1/1       Running       0          10h
onap          dev-aai-modelloader-6687fcc84-2pz8n      2/2       Running       0          10h
onap          dev-aai-resources-67c58fbdc-g22t6        2/2       Running       0          10h
onap          dev-aai-search-data-8686bbd58c-ft7h2     2/2       Running       0          10h
onap          dev-aai-sparky-be-54889bbbd6-rgrr5       2/2       Running       1          10h
onap          dev-aai-traversal-7bb98d854d-2fhjc       2/2       Running       0          10h
onap          dev-log-elasticsearch-5656984bc4-n2n46   1/1       Running       0          10h
onap          dev-log-kibana-567557fb9d-7ksdn          1/1       Running       50         10h
onap          dev-log-logstash-fcc7d68bd-49rv8         1/1       Running       0          10h
onap          dev-robot-6cc48c696b-875p5               1/1       Running       0          10h

ubuntu@obrien-cluster:~$ kubectl describe pod dev-aai-champ-845ff6b947-l8jqt -n onap
Name:           dev-aai-champ-845ff6b947-l8jqt
Namespace:      onap
Node:           obrien-cluster/10.69.25.12
Start Time:     Thu, 10 May 2018 03:32:21 +0000
Labels:         app=aai-champ
                pod-template-hash=4019926503
                release=dev
Annotations:    kubernetes.io/created-by={"kind":"SerializedReference","apiVersion":"v1","reference":{"kind":"ReplicaSet","namespace":"onap","name":"dev-aai-champ-845ff6b947","uid":"bf48c0cd-5402-11e8-91b1-020cc142d4...
Status:         Pending
IP:             10.42.23.228
Created By:     ReplicaSet/dev-aai-champ-845ff6b947
Controlled By:  ReplicaSet/dev-aai-champ-845ff6b947
Init Containers:
  aai-champ-readiness:
    Container ID:  docker://46197a2e7383437ed7d8319dec052367fd78f8feb826d66c42312b035921eb7a
    Image:         oomk8s/readiness-check:2.0.0
    Image ID:      docker-pullable://oomk8s/readiness-check@sha256:7daa08b81954360a1111d03364febcb3dcfeb723bcc12ce3eb3ed3e53f2323ed
    Port:          <none>
    Command:
      /root/ready.py
    Args:
      --container-name
      aai-resources
      --container-name
      message-router-kafka
    State:          Running
      Started:      Thu, 10 May 2018 03:46:14 +0000
    Last State:     Terminated
      Reason:       Error
      Exit Code:    1
      Started:      Thu, 10 May 2018 03:34:58 +0000
      Finished:     Thu, 10 May 2018 03:45:04 +0000
    Ready:          False
    Restart Count:  1
    Environment:
      NAMESPACE:  onap (v1:metadata.namespace)
    Mounts:
      /var/run/secrets/kubernetes.io/serviceaccount from default-token-2jccm (ro)
Containers:
  aai-champ:
    Container ID:
    Image:          nexus3.onap.org:10001/onap/champ:1.2-STAGING-latest
    Image ID:
    Port:           9522/TCP
    State:          Waiting
      Reason:       PodInitializing
    Ready:          False
    Restart Count:  0
    Readiness:      tcp-socket :9522 delay=10s timeout=1s period=10s #success=1 #failure=3
    Environment:
      CONFIG_HOME:           /opt/app/champ-service/appconfig
      GRAPHIMPL:             janus-deps
      KEY_STORE_PASSWORD:    <set to the key 'KEY_STORE_PASSWORD' in secret 'dev-aai-champ-pass'>    Optional: false
      KEY_MANAGER_PASSWORD:  <set to the key 'KEY_MANAGER_PASSWORD' in secret 'dev-aai-champ-pass'>  Optional: false
      SERVICE_BEANS:         /opt/app/champ-service/dynamic/conf
    Mounts:
      /etc/localtime from localtime (ro)
      /logs from dev-aai-champ-logs (rw)
      /opt/app/champ-service/appconfig/auth from dev-aai-champ-secrets (rw)
      /opt/app/champ-service/appconfig/champ-api.properties from dev-aai-champ-config (rw)
      /opt/app/champ-service/dynamic/conf/champ-beans.xml from dev-aai-champ-dynamic-config (rw)
      /var/run/secrets/kubernetes.io/serviceaccount from default-token-2jccm (ro)
Conditions:
  Type           Status
  Initialized    False
  Ready          False
  PodScheduled   True
Volumes:
  localtime:
    Type:  HostPath (bare host directory volume)
    Path:  /etc/localtime
  dev-aai-champ-config:
    Type:      ConfigMap (a volume populated by a ConfigMap)
    Name:      dev-aai-champ
    Optional:  false
  dev-aai-champ-secrets:
    Type:        Secret (a volume populated by a Secret)
    SecretName:  dev-aai-champ-champ
    Optional:    false
  dev-aai-champ-dynamic-config:
    Type:      ConfigMap (a volume populated by a ConfigMap)
    Name:      dev-aai-champ-dynamic
    Optional:  false
  dev-aai-champ-logs:
    Type:    EmptyDir (a temporary directory that shares a pod's lifetime)
    Medium:
  default-token-2jccm:
    Type:        Secret (a volume populated by a Secret)
    SecretName:  default-token-2jccm
    Optional:    false
QoS Class:       BestEffort
Node-Selectors:  <none>
Tolerations:     node.alpha.kubernetes.io/notReady:NoExecute for 300s
                 node.alpha.kubernetes.io/unreachable:NoExecute for 300s
Events:          <none>
ubuntu@obrien-cluster:~$ kubectl delete pod dev-aai-champ-845ff6b947-l8jqt -n onap
pod "dev-aai-champ-845ff6b947-l8jqt" deleted


Developer Debugging

Developer Commits

Developer Reviews

