© 2023 Nebius Israel Ltd

Configuring Apache Kafka® target endpoints

  • Managed Service for Apache Kafka® cluster
  • Custom installation
  • Apache Kafka® topic settings
  • Serializing settings

When creating or editing an endpoint, you can define:

  • Connection settings for a Managed Service for Apache Kafka® cluster or a custom installation, including one deployed on Compute Cloud VMs, as well as serialization settings. These are required parameters.
  • Apache Kafka® topic settings.

Managed Service for Apache Kafka® cluster

Warning

To create or edit an endpoint of a managed database, you need the managed-kafka.viewer role or the primitive viewer role issued for the folder hosting a cluster of this managed database.

Connecting to the database using the cluster ID specified in Nebius Israel. This option is only available for clusters deployed in Managed Service for Apache Kafka®.

Management console
  • Managed Service for Kafka cluster: Select the cluster to connect to.

    • Username: Specify the username that Data Transfer will use to connect to the database.

    • Password: Enter the user's database password.

Custom installation

Connecting to the database with explicitly specified network addresses.

Management console
  • Broker URLs: Specify the IP addresses or FQDNs of the broker hosts.

    If the Apache Kafka® port number differs from the standard one, specify it with a colon after the host name:

    <broker host IP or FQDN>:<port number>
    
  • SSL: Use encryption to protect the connection.

  • PEM Certificate: If encryption of transmitted data is required, for example, to meet the PCI DSS requirements, upload the certificate file or add its contents as text.

  • Endpoint network interface: Select or create a subnet in the desired availability zone.

    If the value in this field is specified for both endpoints, both subnets must be hosted in the same availability zone.

  • Authentication: Select the connection type (SASL or No authentication).

    If you select SASL:

    • Username: Specify the name of the account under which Data Transfer will connect to the topic.
    • Password: Enter the account password.
    • Mechanism: Select the hashing mechanism (SHA 256 or SHA 512).
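Taken together, the custom-installation fields above map onto a typical Kafka client configuration. The sketch below is illustrative only (it mirrors kafka-python-style parameter names, not Data Transfer's internal format): it appends the default broker port 9092 when an address omits one and assembles SASL-over-SSL settings. All helper names are assumptions.

```python
DEFAULT_KAFKA_PORT = 9092  # standard Apache Kafka® broker port


def normalize_broker(address: str) -> str:
    """Append the default port when a broker address has no explicit one."""
    host, sep, port = address.rpartition(":")
    if sep and port.isdigit():
        return address  # already in <host>:<port> form
    return f"{address}:{DEFAULT_KAFKA_PORT}"


def build_connection_settings(brokers, username, password,
                              mechanism="SCRAM-SHA-512", use_ssl=True):
    """Assemble a kafka-python-style settings dict (illustrative only).

    The SHA 256 / SHA 512 options in the UI correspond to the
    SCRAM-SHA-256 / SCRAM-SHA-512 SASL mechanisms.
    """
    return {
        "bootstrap_servers": [normalize_broker(b) for b in brokers],
        "security_protocol": "SASL_SSL" if use_ssl else "SASL_PLAINTEXT",
        "sasl_mechanism": mechanism,
        "sasl_plain_username": username,
        "sasl_plain_password": password,
    }
```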

Apache Kafka® topic settings

Management console
  • Topic:

    • Topic full name: Specify the name of the topic to send messages to.

    • Topic prefix: Specify the topic prefix, similar to the Debezium database.server.name setting. Messages will be sent to a topic named <topic_prefix>.<schema>.<table_name>.
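The prefix-based naming rule above can be expressed as a one-line helper; the function name and the example schema and table are illustrative:

```python
def cdc_topic_name(topic_prefix: str, schema: str, table_name: str) -> str:
    """Build the target topic name as <topic_prefix>.<schema>.<table_name>."""
    return f"{topic_prefix}.{schema}.{table_name}"
```

For example, with the prefix `pg_source`, changes to the `public.orders` table would go to the topic `pg_source.public.orders`.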

Data Transfer supports CDC for transfers from PostgreSQL, MySQL, and YDB databases to Apache Kafka®. Data is sent to the target in Debezium format. For more information about CDC mode, see Change data capture.

Note

In YDB, CDC mode is supported starting from version 22.5.

  • Save transactions order: Do not split the event stream into independent per-table queues.
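To make the Debezium format concrete, here is a heavily simplified sketch of a change event for an INSERT. Real Debezium messages also carry a schema section and richer source metadata; every field value below is invented for illustration.

```python
# Simplified Debezium-style change event for an INSERT ("c" = create).
# Real events also include a "schema" block and fuller "source" metadata.
change_event = {
    "payload": {
        "op": "c",        # c = create, u = update, d = delete
        "before": None,   # row state before the change (None for an INSERT)
        "after": {"id": 1, "name": "example"},  # row state after the change
        "source": {
            "connector": "postgresql",  # source database type
            "schema": "public",
            "table": "orders",
        },
    },
}
```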

Serializing settings

Management console
  • Serializing settings: Select the serialization type (Auto or Debezium).

    • Debezium serializer settings: Specify the Debezium serialization parameters.
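As an illustration of what Debezium serialization parameters look like, the dict below lists two standard Debezium converter settings as key-value pairs; whether Data Transfer exposes exactly these names is an assumption.

```python
# Illustrative Debezium serializer parameters. decimal.handling.mode and
# interval.handling.mode are standard Debezium settings; the exact set of
# parameters accepted by Data Transfer is an assumption.
debezium_serializer_settings = {
    "decimal.handling.mode": "precise",   # precise | double | string
    "interval.handling.mode": "numeric",  # numeric | string
}
```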