VES-HV Collector
The HV-VES collector was proposed to address the need to process high volumes of data generated frequently by a large number of NFs. It uses plain TCP connections; the connections are stream-based (as opposed to request-based) and long-running. The payload is binary-encoded (currently using Google Protocol Buffers).
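For illustration only, a binary-encoded event of this kind could be described by a Protocol Buffers schema along the following lines (the message and field names here are illustrative, not the actual HV-VES schema):

```protobuf
syntax = "proto3";

// Illustrative sketch only; the real HV-VES event schema differs.
message Event {
  string source_name = 1;  // identifier of the emitting NF instance
  uint64 timestamp = 2;    // event time, e.g. epoch microseconds
  bytes payload = 3;       // opaque, domain-specific content
}
```

Because the wire format is a compact binary encoding rather than JSON, per-event overhead stays low even at high event rates.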
Pros:
- Designed to support high volumes of data with minimal latency
- HV-VES uses a direct connection to DMaaP's Kafka
Cons:
- Adds a dependency on the HV-VES DCAE components
DMaaP Kafka:
Listener Configuration
Encryption and authentication in Kafka brokers is configured per listener.
Each listener in the Kafka broker is configured with its own security protocol. The configuration property listener.security.protocol.map defines which listener uses which security protocol: it maps each listener name to its security protocol.
The supported security protocols are:
- PLAINTEXT
Listener without any encryption or authentication.
- SSL
Listener using TLS encryption and, optionally, authentication using TLS client certificates.
- SASL_PLAINTEXT
Listener without encryption but with SASL-based authentication.
- SASL_SSL
Listener with TLS-based encryption and SASL-based authentication.
DMaaP Message Router Kafka uses SASL_PLAINTEXT by default.
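On the broker side, the listener-to-protocol mapping and a SASL_PLAINTEXT listener would be declared roughly as follows in server.properties (a sketch only; the listener names, ports, and mechanism shown are examples, not DMaaP Message Router's actual settings):

```properties
# Example broker settings, not the actual DMaaP Message Router values
listeners=INTERNAL://0.0.0.0:9092,EXTERNAL://0.0.0.0:9093
listener.security.protocol.map=INTERNAL:PLAINTEXT,EXTERNAL:SASL_PLAINTEXT
inter.broker.listener.name=INTERNAL
sasl.enabled.mechanisms=PLAIN
```

Here the EXTERNAL listener requires SASL authentication while the INTERNAL listener, used for inter-broker traffic, stays unauthenticated.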
Configuration required at the publisher end:
spring:
  kafka:
    bootstrap-servers: host:port
    security:
      protocol: SASL_PLAINTEXT
    properties:
      sasl.mechanism: PLAIN
      sasl.jaas.config: org.apache.kafka.common.security.plain.PlainLoginModule required username="admin" password="admin_secret";
      ssl.endpoint.identification.algorithm:

Configuration required at the consumer end:

spring:
  kafka:
    consumer:
      # Configures the Spring Kafka ErrorHandlingDeserializer that delegates to the 'real' deserializers
      # See https://docs.spring.io/spring-kafka/docs/2.5.11.RELEASE/reference/html/#error-handling-deserializer
      # and https://www.confluent.io/blog/spring-kafka-can-your-kafka-consumers-handle-a-poison-pill/
      key-deserializer: org.springframework.kafka.support.serializer.ErrorHandlingDeserializer
      value-deserializer: org.springframework.kafka.support.serializer.ErrorHandlingDeserializer
      properties:
        spring.deserializer.key.delegate.class: org.apache.kafka.common.serialization.StringDeserializer
        spring.deserializer.value.delegate.class: org.springframework.kafka.support.serializer.JsonDeserializer
        spring.json.value.default.type: org.onap.cps.event.model.CpsDataUpdatedEvent

app:
  kafka:
    consumer:
      topic: ${KAFKA_CONSUMER_TOPIC:cps.cfg-state-events}
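The same consumer settings can also be assembled programmatically; a minimal sketch of the equivalent java.util.Properties is shown below (values mirror the YAML on this page, and the admin/admin_secret credentials are the same placeholder values used above, not real secrets):

```java
import java.util.Properties;

public class ConsumerConfigSketch {

    // Builds Kafka consumer properties equivalent to the YAML configuration above.
    static Properties consumerProperties() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "host:port");
        props.put("security.protocol", "SASL_PLAINTEXT");
        props.put("sasl.mechanism", "PLAIN");
        // JAAS configuration; values must be double-quoted
        props.put("sasl.jaas.config",
                "org.apache.kafka.common.security.plain.PlainLoginModule required "
                + "username=\"admin\" password=\"admin_secret\";");
        // ErrorHandlingDeserializer wraps the 'real' delegate deserializers
        props.put("key.deserializer",
                "org.springframework.kafka.support.serializer.ErrorHandlingDeserializer");
        props.put("value.deserializer",
                "org.springframework.kafka.support.serializer.ErrorHandlingDeserializer");
        props.put("spring.deserializer.key.delegate.class",
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("spring.deserializer.value.delegate.class",
                "org.springframework.kafka.support.serializer.JsonDeserializer");
        props.put("spring.json.value.default.type",
                "org.onap.cps.event.model.CpsDataUpdatedEvent");
        return props;
    }

    public static void main(String[] args) {
        System.out.println(consumerProperties().getProperty("security.protocol"));
    }
}
```

Wrapping the delegate deserializers in ErrorHandlingDeserializer means a malformed record (a "poison pill") is routed to the container's error handler instead of crashing the consumer in an endless deserialization loop.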