...
```yaml
kafka:
  sasl_plaintext:
    security:
      protocol: SASL_PLAINTEXT
    ssl:
      trust-store-type:
      trust-store-location:
      trust-store-password:
    properties:
      sasl.mechanism: PLAIN
  sasl_ssl:
    security:
      protocol: SASL_SSL
    ssl:
      trust-store-type: JKS
      trust-store-location: file:///C:/Users/adityaputhuparambil/ltec-com-strimzi.jks
      trust-store-password: secret
    properties:
      sasl.mechanism: SCRAM-SHA-512
      sasl.jaas.config: org.apache.kafka.common.security.plain.PlainLoginModule required username=admin password=admin_secret;
      ssl.endpoint.identification.algorithm:
```
2. SSL: Listener using TLS encryption and, optionally, authentication using TLS client certificates.

3. SASL_PLAINTEXT using the PLAIN mechanism:
...
DMaaP Message Router Kafka uses SASL_PLAINTEXT by default.
```yaml
kafka:
  sasl_plaintext:
    security:
      protocol: SASL_PLAINTEXT
    ssl:
      trust-store-type:
      trust-store-location:
      trust-store-password:
    properties:
      sasl.mechanism: PLAIN
```
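The same SASL_PLAINTEXT settings can also be assembled programmatically for a plain Kafka client. The sketch below uses the standard Kafka client configuration keys; the broker address and credentials are illustrative placeholders, not values taken from this page:

```java
import java.util.Properties;

public class SaslPlaintextConfig {

    // Builds Kafka client properties for a SASL_PLAINTEXT listener with the
    // PLAIN mechanism. Broker address and credentials are placeholder values.
    static Properties saslPlaintextProps(String bootstrap, String user, String password) {
        Properties props = new Properties();
        props.put("bootstrap.servers", bootstrap);
        props.put("security.protocol", "SASL_PLAINTEXT");
        props.put("sasl.mechanism", "PLAIN");
        // Equivalent of the sasl.jaas.config entry in the YAML above
        props.put("sasl.jaas.config",
                "org.apache.kafka.common.security.plain.PlainLoginModule required "
                + "username=\"" + user + "\" password=\"" + password + "\";");
        return props;
    }

    public static void main(String[] args) {
        Properties props = saslPlaintextProps("localhost:9092", "admin", "admin_secret");
        System.out.println(props.getProperty("sasl.jaas.config"));
    }
}
```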
The Kafka configuration details can be supplied in the Helm override files as shown below:
```yaml
kafka:
  security:
    protocol: '{{ .Values.kafka.sasl_plaintext.security.protocol }}'
  ssl:
    trust-store-type: '{{ .Values.kafka.sasl_plaintext.ssl.trust-store-type }}'
    trust-store-location: '{{ .Values.kafka.sasl_plaintext.ssl.trust-store-location }}'
    trust-store-password: '{{ .Values.kafka.sasl_plaintext.ssl.trust-store-password }}'
  properties:
    sasl.mechanism: '{{ .Values.kafka.sasl_plaintext.properties.sasl_mechanism }}'
    sasl.jaas.config: '{{ .Values.kafka.sasl.jaas.config }}'
```
4. SASL_SSL using SCRAM-SHA-256 and SCRAM-SHA-512: Implements authentication using the Salted Challenge Response Authentication Mechanism (SCRAM). SCRAM credentials are stored centrally in ZooKeeper. SCRAM can be used in situations where ZooKeeper cluster nodes are running isolated in a private network.
spring.kafka.ssl-related configuration is required. To use TLS encryption and server authentication, a keystore containing the private and public keys has to be provided. This is usually done using a file in the Java KeyStore (JKS) format.
...
```yaml
spring:
  kafka:
    bootstrap-servers: ${KAFKA_BOOTSTRAP_SERVER}
    security:
      protocol: {{ .Values.kafka.security.protocol }}
    ssl:
      trust-store-type: {{ .Values.kafka.ssl.trust-store-type }}
      trust-store-location: {{ .Values.kafka.ssl.trust-store-location }}
      trust-store-password: {{ .Values.kafka.ssl.trust-store-password }}
    properties:
      sasl.mechanism: '{{ .Values.kafka.properties.sasl_mechanism }}'
      sasl.jaas.config: '{{ .Values.kafka.properties.sasl.jaas.config }}'
      ssl.endpoint.identification.algorithm:
    # at producer end only
    producer:
      value-serializer: org.springframework.kafka.support.serializer.JsonSerializer
    # at consumer end only
```
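For illustration, a JKS truststore can also be created programmatically with java.security.KeyStore (in practice keytool is more common, and the broker's CA certificate would be imported into the store, e.g. with `keytool -importcert`). This minimal sketch only creates an empty store, to show the file format that the trust-store-location property points at:

```java
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.security.KeyStore;

public class TruststoreDemo {

    // Creates an (empty) JKS truststore protected by the given password.
    // Real deployments would add the broker's CA certificate to this store.
    static Path createEmptyTruststore(char[] password) throws Exception {
        KeyStore ks = KeyStore.getInstance("JKS");
        ks.load(null, password); // initialize a fresh, empty keystore
        Path file = Files.createTempFile("truststore", ".jks");
        try (FileOutputStream out = new FileOutputStream(file.toFile())) {
            ks.store(out, password);
        }
        return file;
    }

    public static void main(String[] args) throws Exception {
        char[] password = "secret".toCharArray();
        Path ts = createEmptyTruststore(password);
        // Reload to prove the file is a valid JKS store
        KeyStore reloaded = KeyStore.getInstance("JKS");
        try (FileInputStream in = new FileInputStream(ts.toFile())) {
            reloaded.load(in, password);
        }
        System.out.println(reloaded.size()); // prints 0: no certificates imported yet
    }
}
```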
NOTE: Topics are auto-generated in ONAP DMaaP Kafka, hence topic creation is not covered in the scope of CPS.
PoC:

The PoC was performed with DMaaP Message Router Kafka running on a k8s environment (172.16.1.205) in the Nordix lab. The configuration details for both cps-core and cps-temporal are shared below:
```yaml
spring:
  kafka:
    bootstrap-servers: 172.16.3.38:30490
    security:
      protocol: SASL_PLAINTEXT
    properties:
      sasl.mechanism: PLAIN
      sasl.jaas.config: org.apache.kafka.common.security.plain.PlainLoginModule required username=admin password=admin_secret;
      ssl.endpoint.identification.algorithm:
    producer:
      # Configures the Spring Kafka ErrorHandlingDeserializer that delegates to the 'real' deserializers
      # See https://docs.spring.io/spring-kafka/docs/2.5.11.RELEASE/reference/html/#error-handling-deserializer
      # and https://www.confluent.io/blog/spring-kafka-can-your-kafka-consumers-handle-a-poison-pill/
      key-deserializer: org.springframework.kafka.support.serializer.ErrorHandlingDeserializer
      value-deserializer: org.springframework.kafka.support.serializer.ErrorHandlingDeserializer
      properties:
        spring.deserializer.key.delegate.class: org.apache.kafka.common.serialization.StringDeserializer
        spring.deserializer.value.delegate.class: org.springframework.kafka.support.serializer.JsonDeserializer
        spring.json.value.default.type: org.onap.cps.event.model.CpsDataUpdatedEvent

app:
  kafka:
    consumer:
      topic: ${KAFKA_CONSUMER_TOPIC:cps.cfg-state-events}
```
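The delegate wiring above can also be expressed as raw consumer properties. This sketch simply collects the same Spring Kafka property keys into a map: the ErrorHandlingDeserializer wraps the real String/JSON deserializers, so a "poison pill" record is routed to the error handler instead of crashing the consumer loop:

```java
import java.util.HashMap;
import java.util.Map;

public class PoisonPillConfig {

    // Mirrors the YAML above: ErrorHandlingDeserializer is registered as the
    // Kafka deserializer, and the classes that do the actual work are named
    // via Spring Kafka's delegate properties.
    static Map<String, String> consumerDeserializerProps() {
        Map<String, String> props = new HashMap<>();
        props.put("key.deserializer",
                "org.springframework.kafka.support.serializer.ErrorHandlingDeserializer");
        props.put("value.deserializer",
                "org.springframework.kafka.support.serializer.ErrorHandlingDeserializer");
        // Delegates invoked by the wrapper; failures surface as error-handling
        // events rather than unrecoverable deserialization exceptions
        props.put("spring.deserializer.key.delegate.class",
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("spring.deserializer.value.delegate.class",
                "org.springframework.kafka.support.serializer.JsonDeserializer");
        return props;
    }

    public static void main(String[] args) {
        consumerDeserializerProps().forEach((k, v) -> System.out.println(k + "=" + v));
    }
}
```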
```yaml
spring:
  kafka:
    bootstrap-servers: 172.16.3.38:30490
    security:
      protocol: SASL_PLAINTEXT
    properties:
      sasl.mechanism: PLAIN
      sasl.jaas.config: org.apache.kafka.common.security.plain.PlainLoginModule required username=admin password=admin_secret;
      ssl.endpoint.identification.algorithm:
    consumer:
      group-id: ${KAFKA_CONSUMER_GROUP_ID:cps-temporal-group}
      # Configures the Spring Kafka ErrorHandlingDeserializer that delegates to the 'real' deserializers
      # See https://docs.spring.io/spring-kafka/docs/2.5.11.RELEASE/reference/html/#error-handling-deserializer
      # and https://www.confluent.io/blog/spring-kafka-can-your-kafka-consumers-handle-a-poison-pill/
      key-deserializer: org.springframework.kafka.support.serializer.ErrorHandlingDeserializer
      value-deserializer: org.springframework.kafka.support.serializer.ErrorHandlingDeserializer
      properties:
        spring.deserializer.key.delegate.class: org.apache.kafka.common.serialization.StringDeserializer
        spring.deserializer.value.delegate.class: org.springframework.kafka.support.serializer.JsonDeserializer
        spring.json.value.default.type: org.onap.cps.event.model.CpsDataUpdatedEvent

app:
  kafka:
    consumer:
      topic: ${KAFKA_CONSUMER_TOPIC:cps.cfg-state-events}
```