...

The configuration details below need to be added to the application YAML of both the publisher (cps-core) and the consumer (cps-temporal) of the events published to Kafka. This configuration should be defined in the application-helm.yaml included in the OOM charts to provide flexibility when deploying the application. The environment variables could also be replaced by override values.

spring:
  kafka:
    bootstrap-servers: ${KAFKA_BOOTSTRAP_SERVER}
    security:
      protocol: ${KAFKA_SECURITY_PROTOCOL}
    ssl:
      trust-store-type: ${KAFKA_SSL_TRUST_TYPE}
      trust-store-location: ${KAFKA_SSL_TRUST_STORE_LOCATION}
      trust-store-password: ${KAFKA_SSL_TRUST_STORE_PASSWORD}
    properties:
      sasl.mechanism: ${KAFKA_SASL_MECHANISM}
      sasl.jaas.config: ${KAFKA_SASL_JAAS_CONFIG}
      ssl.endpoint.identification.algorithm:

app:
  kafka:
    consumer:
      topic: ${KAFKA_CONSUMER_TOPIC:cps.cfg-state-events}
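
For illustration, a minimal consumer sketch is shown below. It assumes Spring Kafka is on the classpath and only demonstrates how the app.kafka.consumer.topic property above would be consumed; the class name is hypothetical and the actual cps-temporal listener may differ.

package org.onap.cps.temporal.listener;

import org.onap.cps.event.model.CpsDataUpdatedEvent;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

// Hypothetical listener class; the real cps-temporal implementation may differ.
@Component
public class DataUpdatedEventListener {

    // Topic name is resolved from 'app.kafka.consumer.topic', defaulting to 'cps.cfg-state-events'.
    @KafkaListener(topics = "${app.kafka.consumer.topic:cps.cfg-state-events}")
    public void consume(final CpsDataUpdatedEvent cpsDataUpdatedEvent) {
        // Process/persist the received event (omitted here).
    }
}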


Topics (TODO: discuss with Fiachra and Bruno/Renu to understand how this is handled in both ONAP and Bell) are auto-generated in ONAP DMaaP Kafka. Hence, topic creation is not covered in the scope of CPS. For reference, the ONAP OOM message-router Kafka configuration overrides are shown below.

dmaap:
  message-router:
    message-router-kafka:
      configurationOverrides:
        "offsets.topic.replication.factor": "3"
        "log.dirs": "/var/lib/kafka/data"
        "log.retention.hours": "168"
        "num.partitions": "3"
        "transaction.state.log.replication.factor": "1"
        "transaction.state.log.min.isr": "1"
        "num.recovery.threads.per.data.dir": "5"
        "zookeeper.connection.timeout.ms": "6000"
        "default.replication.factor": "3"
        "zookeeper.set.acl": "true"

kafka:
  plaintext:
    security:
      protocol: SASL_PLAINTEXT
    ssl:
      trust-store-type:
      trust-store-location:
      trust-store-password:
    properties:
      sasl.mechanism: PLAIN
  sasl_ssl:
    security:
      protocol: SASL_SSL
    ssl:
      trust-store-type: JKS
      trust-store-location: file:///C:/Users/adityaputhuparambil/ltec-com-strimzi.jks
      trust-store-password: secret
    properties:
      sasl.mechanism: SCRAM-SHA-512
      sasl.jaas.config: org.apache.kafka.common.security.scram.ScramLoginModule required username="admin" password="admin_secret";

SASL Authentication

SASL authentication is supported both over plain unencrypted connections and over TLS connections.

...

spring:
  kafka:
    bootstrap-servers: ${KAFKA_BOOTSTRAP_SERVER}
    security:
      protocol: {{ .Values.kafka.plaintext.security.protocol }}
    ssl:
      trust-store-type: {{ .Values.kafka.plaintext.ssl.trust-store-type }}
      trust-store-location:
      trust-store-password:
    properties:
      sasl.mechanism: PLAIN
      sasl.jaas.config: org.apache.kafka.common.security.plain.PlainLoginModule required username="admin" password="admin_secret";
      ssl.endpoint.identification.algorithm:
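
As an illustration of the publisher side, the sketch below sends a CpsDataUpdatedEvent through the KafkaTemplate that Spring Boot auto-configures from the YAML above (so it inherits the SASL settings). The class name and the way the topic name is passed in are assumptions for the sketch, not the actual cps-core implementation.

package org.onap.cps.notification;

import org.onap.cps.event.model.CpsDataUpdatedEvent;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

// Hypothetical publisher class; shown only to illustrate use of the configuration above.
@Service
public class DataUpdatedEventPublisher {

    private final KafkaTemplate<String, CpsDataUpdatedEvent> kafkaTemplate;

    public DataUpdatedEventPublisher(final KafkaTemplate<String, CpsDataUpdatedEvent> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    public void publish(final String topicName, final CpsDataUpdatedEvent event) {
        // Value serialization (e.g. JSON) is driven by the configured producer serializer.
        kafkaTemplate.send(topicName, event);
    }
}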


...

spring:
  kafka:
    bootstrap-servers: ${KAFKA_BOOTSTRAP_SERVER}
    security:
      protocol: SASL_SSL
    ssl:
      trust-store-type: JKS
      trust-store-location: file:///C:/Users/adityaputhuparambil/ltec-com-strimzi.jks
      trust-store-password: secret
    properties:
      sasl.mechanism: SCRAM-SHA-512
      sasl.jaas.config: org.apache.kafka.common.security.scram.ScramLoginModule required username="admin" password="admin_secret";
      ssl.endpoint.identification.algorithm:
    consumer:
      # Configures the Spring Kafka ErrorHandlingDeserializer that delegates to the 'real' deserializers
      # See https://docs.spring.io/spring-kafka/docs/2.5.11.RELEASE/reference/html/#error-handling-deserializer
      # and https://www.confluent.io/blog/spring-kafka-can-your-kafka-consumers-handle-a-poison-pill/
      key-deserializer: org.springframework.kafka.support.serializer.ErrorHandlingDeserializer
      value-deserializer: org.springframework.kafka.support.serializer.ErrorHandlingDeserializer
      properties:
        spring.deserializer.key.delegate.class: org.apache.kafka.common.serialization.StringDeserializer
        spring.deserializer.value.delegate.class: org.springframework.kafka.support.serializer.JsonDeserializer
        spring.json.value.default.type: org.onap.cps.event.model.CpsDataUpdatedEvent
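
The ErrorHandlingDeserializer only catches the failure; a container error handler still has to decide what to do with the poison pill. Below is a minimal sketch of the dead-letter approach described in the referenced Confluent article, assuming Spring Boot picks up the ErrorHandler bean for the auto-configured listener container factory; the bean, back-off values and dead-letter topic naming are illustrative, not the CPS implementation.

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.listener.DeadLetterPublishingRecoverer;
import org.springframework.kafka.listener.SeekToCurrentErrorHandler;
import org.springframework.util.backoff.FixedBackOff;

// Illustrative configuration: records that cannot be processed (for example deserialization
// failures reported by the ErrorHandlingDeserializer) end up on a '<topic>.DLT' dead-letter topic.
@Configuration
public class KafkaErrorHandlingConfig {

    @Bean
    public SeekToCurrentErrorHandler errorHandler(final KafkaTemplate<Object, Object> kafkaTemplate) {
        // The template's serializers must be able to handle the raw (possibly byte[]) failed record.
        final DeadLetterPublishingRecoverer recoverer = new DeadLetterPublishingRecoverer(kafkaTemplate);
        // Retryable failures are retried twice with a 1 second back-off before the record is recovered.
        return new SeekToCurrentErrorHandler(recoverer, new FixedBackOff(1000L, 2L));
    }
}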