(SM-2502) Kafka

Storage management supports writing data to Kafka. The integration provides the following functions:

  • Automatic topic creation, with the topic name derived from the source object name.

  • Sending JSON or AVRO serialized messages.

  • Integration with Schema Registry.

The connectivity is tested with Confluent Kafka and Apache Kafka but should be compatible with any Kafka distribution or Kafka-compatible API.
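The document states that topic names are derived from the source object name, but does not spell out the derivation rule. As a minimal sketch only, the helper below illustrates the constraint any such rule must satisfy: Kafka topic names may contain only letters, digits, `.`, `_`, and `-`, and are limited to 249 characters, so characters such as `/` in namespaced SAP object names must be replaced. The replacement character and the function name are assumptions for illustration, not the product's actual logic.

```python
import re

def derive_topic_name(source_object: str) -> str:
    """Illustrative sketch: map an SAP source object name to a legal
    Kafka topic name. The actual derivation used by the product may
    differ; this only demonstrates the Kafka naming constraints.
    """
    # Replace every character that is not legal in a Kafka topic name.
    name = re.sub(r"[^a-zA-Z0-9._-]", ".", source_object)
    # Kafka limits topic names to 249 characters.
    return name[:249]

# Namespaced SAP object names contain '/', which is not a legal
# topic-name character:
# derive_topic_name("/DVD/SALES") -> ".DVD.SALES"
```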

General Prerequisites

Reuse Library release

Kafka integration is available in Reuse Library 2311 or higher.

SAP NetWeaver release

Storage management requires SAP ABAP stack NW 7.01 SP15 or higher.

Open Ports

To enable communication between the SAP system and the Kafka cluster, outbound communication from the SAP system to the following services on the Kafka cluster must be allowed:

Port   Protocol   Target
9092   TCP        Kafka host FQDN
2181   TCP        Zookeeper host FQDN
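A quick way to verify from the network side that the listed ports are open is a plain TCP connection attempt. The sketch below is a generic connectivity check, not part of the product; the hostnames in the comment are placeholders.

```python
import socket

def port_reachable(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port can be opened
    within the given timeout, False otherwise."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example usage (hypothetical hostnames):
# port_reachable("kafka-broker.example.com", 9092)
# port_reachable("zookeeper.example.com", 2181)
```

Note that a successful TCP connection only proves the port is open; authentication and TLS issues surface later, at the Kafka protocol level.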

Java connector

The Java connector is a critical middleware component used for communication between SAP and Kafka. Follow the installation steps described in the Java Connector Setup.
The minimum Java connector version required for Kafka integration is 238.

Kafka Client config file

Authentication to the Kafka cluster is configured using the standard client.config file. The file must be accessible to all SAP application servers. The recommended location is /sapmnt/<SID>/global/security/dvd_conn/client.config

Below you can see several example configuration files.

Confluent cloud

# Kafka cluster configuration
bootstrap.servers=<bootstrap-server>
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="<api-key>" password="<api-secret>";
# Schema Registry configuration
schema.registry.url=<schema-registry-url>
basic.auth.credentials.source=USER_INFO
basic.auth.user.info=<schema-registry-api-key>:<schema-registry-api-secret>
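The client.config file follows the usual Java-properties conventions: `#` starts a comment, and each setting is `key=value`. One subtlety worth noting is that values such as the `sasl.jaas.config` line themselves contain `=` characters, so a parser must split on the first `=` only. The following sketch (not part of the product, just an illustration of the file format) shows this:

```python
def parse_client_config(text: str) -> dict:
    """Parse Java-properties-style client.config text into a dict.

    Comment lines start with '#'. Values (e.g. sasl.jaas.config) may
    contain '=', so only the first '=' separates key from value.
    """
    config = {}
    for raw in text.splitlines():
        line = raw.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        config[key.strip()] = value.strip()
    return config
```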

Apache Kafka with Kerberos

# Kafka producer configuration
bootstrap.servers=<bootstrap-server>
acks=all
# Kerberos authentication configuration
security.protocol=SASL_SSL
sasl.mechanism=GSSAPI
sasl.kerberos.service.name=kafka
# JAAS configuration using keytab file
sasl.jaas.config=com.sun.security.auth.module.Krb5LoginModule required useKeyTab=true keyTab="/path/to/bob.keytab" principal="bob@REALM";
# SSL truststore configuration
ssl.truststore.location=/opt/cloudera/security/jks/truststore.jks
ssl.truststore.password=<truststore-password>

If you need to specify a non-default krb5.conf file, add the string -Djava.security.krb5.conf=/path/to/krb5.conf to the parameter "Advanced → Additional java starting arguments" in transaction /dvd/jco_mng.

If the AVRO file type is used, the Schema Registry configuration is mandatory.

Create Kafka storage in Storage Management

Go to transaction /DVD/SM_SETUP.
Create a new storage of type KAFKA.


Storage ID

Logical identifier of the storage (maximum 10 characters)

Java connector RFC

RFC destination for communication with Java connector

Configuration file

When the button is pressed, a popup window appears with a text input field where you can enter one of two options:

  1. The physical path to the client.config file. It must be accessible from all application servers, or from the JCO host in standalone JCO deployments.

  2. The contents of the config file itself. Passwords can be wrapped using the pattern <pass><pass> (e.g., password="<pass>password123<pass>"). When the storage is saved, the wrapped passwords are automatically encrypted.
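The <pass><pass> markers make the plaintext secrets easy to locate mechanically. The actual encryption happens inside the product when the storage is saved; as an illustration only, the sketch below shows how such markers could be found and masked with a simple regular expression (the function names and the `<encrypted>` placeholder are assumptions, not product behavior):

```python
import re

# Matches a secret wrapped as <pass>secret<pass>.
PASS_PATTERN = re.compile(r"<pass>(.*?)<pass>")

def find_plaintext_passwords(config_text: str) -> list:
    """Return all secrets wrapped in <pass>...<pass> markers."""
    return PASS_PATTERN.findall(config_text)

def mask_passwords(config_text: str) -> str:
    """Replace each wrapped secret with a placeholder for display."""
    return PASS_PATTERN.sub("<encrypted>", config_text)
```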

File Type

Format of the messages sent to Kafka topics: JSON or AVRO.
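The document does not show what a serialized message looks like. Purely as a sketch of the JSON option, a single table row could be serialized as a UTF-8 encoded JSON message value along these lines (the function name and field names are hypothetical; AVRO serialization additionally requires a registered schema):

```python
import json

def to_json_message(row: dict) -> bytes:
    """Serialize one table row as a UTF-8 encoded JSON message value.

    sort_keys gives a deterministic field order; ensure_ascii=False
    keeps non-ASCII characters (common in SAP text fields) readable.
    """
    return json.dumps(row, ensure_ascii=False, sort_keys=True).encode("utf-8")

# Hypothetical example row from an SAP table:
# to_json_message({"MATNR": "000000000000000042", "WERKS": "1000"})
```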

Conversion type

If AVRO is used, a conversion type can be chosen: CSV or JCo table. The default and recommended option is CSV.

Write mode

Transactional / non-transactional. Transactional mode gives an exactly-once delivery guarantee but reduces throughput.

Use topic

Flag to create the topic during table activation. If it is turned off, the topic must already exist on the target.

Replication factor

The replication factor of the created topic.

Topic Partitions

The number of partitions created for the topic.
