With the deprecation of Message Router, the automatic creation of Kafka topics will no longer be supported.
Topic creation is now managed by Strimzi at Helm chart level.
Applications should identify up front which topics they require and what access (read/write) they need to those topics.
It is also recommended that all clients communicate with the Kafka cluster directly, using their preferred Kafka client library.


The examples below refer to the CPS project, which uses spring-kafka to communicate with the cluster.

CPS both produces to and consumes from the same topic.
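
As a rough sketch of the client side, a spring-kafka application could be configured along the following lines. The bootstrap address, listener port and credential environment variables are assumptions for illustration only; the consumer group and topic names correspond to the values.yaml shown further below.

Code Block
# application.yml (sketch) for a spring-kafka client authenticating with SCRAM-SHA-512.
# The client would then produce to / consume from cps.data-updated-events.
spring:
  kafka:
    # Assumed address: the Strimzi bootstrap service, prefixed with the release name.
    bootstrap-servers: onap-strimzi-kafka-bootstrap:9092
    consumer:
      group-id: cps-temporal-group
    properties:
      security.protocol: SASL_PLAINTEXT
      sasl.mechanism: SCRAM-SHA-512
      # KAFKA_USERNAME / KAFKA_PASSWORD are illustrative environment variables,
      # expected to hold the KafkaUser name and the Strimzi-generated password.
      sasl.jaas.config: >-
        org.apache.kafka.common.security.scram.ScramLoginModule required
        username="${KAFKA_USERNAME}" password="${KAFKA_PASSWORD}";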

For Kafka clients that require a topic, the topic needs to be defined as a KafkaTopic custom resource in the relevant Helm chart.

Code Block
{{- if .Values.config.useStrimziKafka }}
apiVersion: kafka.strimzi.io/v1beta2
kind: KafkaTopic
metadata:
  name: {{ .Values.config.dataUpdatedTopic.name }}
  labels:
    strimzi.io/cluster: {{ include "common.release" . }}-strimzi
spec:
  partitions: {{ .Values.config.dataUpdatedTopic.partitions }}
  config:
    retention.ms: {{ .Values.config.dataUpdatedTopic.retentionMs }}
    segment.bytes: {{ .Values.config.dataUpdatedTopic.segmentBytes }}
{{- end }}
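
For reference, with the values.yaml shown further below and a hypothetical release name of onap, this template renders to roughly the following resource.

Code Block
# Rendered KafkaTopic (sketch) assuming a release name of "onap".
apiVersion: kafka.strimzi.io/v1beta2
kind: KafkaTopic
metadata:
  name: cps.data-updated-events
  labels:
    strimzi.io/cluster: onap-strimzi
spec:
  partitions: 10
  config:
    retention.ms: 7200000
    segment.bytes: 1073741824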


For Kafka clients that require access (read/write) to a topic, a KafkaUser custom resource needs to be defined in the relevant Helm chart.

Code Block
{{- if .Values.config.useStrimziKafka }}
apiVersion: kafka.strimzi.io/v1beta2
kind: KafkaUser
metadata:
  name: {{ include "common.release" . }}-{{ .Values.global.cpsKafkaUser }}
  labels:
    strimzi.io/cluster: {{ include "common.release" . }}-strimzi
spec:
  authentication:
    type: scram-sha-512
  authorization:
    type: simple
    acls:
    - resource:
        type: group
        name: {{ .Values.config.dataUpdatedTopic.consumer.groupId }}
      operation: Read
    - resource:
        type: topic
        name: {{ .Values.config.dataUpdatedTopic.name }}
      operation: Read
    - resource:
        type: topic
        name: {{ .Values.config.dataUpdatedTopic.name }}
      operation: Write
{{- end }}

The relevant values are defined in values.yaml:

Code Block
global:
  kafkaBootstrap: strimzi-kafka-bootstrap
  cpsKafkaUser: cps-kafka-user

config:
  useStrimziKafka: true
  dataUpdatedTopic:
    name: cps.data-updated-events
    partitions: 10
    retentionMs: 7200000
    segmentBytes: 1073741824
    consumer:
      groupId: cps-temporal-group

Strimzi creates a Kubernetes Secret with the same name as the KafkaUser, containing the credentials the client uses to authenticate.
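
A minimal sketch of how the client pod could pick up those credentials is shown below; the environment variable name is illustrative, and the password key is the one Strimzi typically generates for SCRAM-SHA-512 users.

Code Block
# Deployment snippet (sketch): inject the SCRAM password from the secret
# created by Strimzi for the KafkaUser defined above.
env:
  - name: KAFKA_PASSWORD
    valueFrom:
      secretKeyRef:
        name: {{ include "common.release" . }}-{{ .Values.global.cpsKafkaUser }}
        key: password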