Genpact Cora Knowledge Center

Configuring Kafka Producers and Subscribers

V9.2

Overview

You configure Kafka activities to set up a messaging mechanism within Cora SeQuence or between Cora SeQuence and other applications. Apache Kafka® is a distributed streaming platform that is designed to be fast, scalable, and durable. Kafka is generally used to move data between systems or applications and to enable applications to consume data required to perform specific actions.

Use cases

The Kafka messaging mechanism can be used in different scenarios. The following are a few examples:

  • Asynchronous communication with external systems: 
    • One-to-one communication scenario: 
      1. Receive messages from an ERP system to initiate an invoice approval workflow.
      2. After invoice approval, send message back to the ERP system for further processing.
  • Event publishing: 
    • One-to-many communication scenario: Send messages to a topic that has multiple subscribers (a sketch of this pattern follows the list). 
      1. Send a message to the topic after a payment process has completed.
      2. Multiple systems that subscribe to the topic perform an action based on the message. 
  • Internal communication:
    • Send a message from one process to another:
      • To start a sub-workflow asynchronously, which prevents blocking and delays in the master workflow.
      • To resume workflow execution in another branch.
  • Multiple tasks per single message:
    • Handle a message sent from an external system to trigger multiple workflows:
      • A solution for welcoming a newly hired employee. One message with information about the new employee triggers multiple workflows, such as starting a workflow that requests the IT department to supply a laptop to the new employee, another workflow that requests the Security department to issue an ID tag, and yet another workflow for the HR department to add the employee to the relevant systems.
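
The one-to-many and multiple-tasks scenarios above rely on Kafka consumer groups: subscribers that use different group IDs each receive their own copy of every message on a topic, while subscribers that share a group ID split the messages between them. The following Java sketch illustrates the fan-out pattern using the standard Apache Kafka client; the broker address, topic, and group IDs are hypothetical and are not part of Cora SeQuence.

    import java.time.Duration;
    import java.util.Collections;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.common.serialization.StringDeserializer;

    public class FanOutSketch {

        // Builds a consumer for the given group ID. Consumers with different group IDs
        // each receive every message on the topic; consumers sharing a group ID
        // divide the messages between them.
        static KafkaConsumer<String, String> consumerFor(String groupId) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "broker1:9092");   // hypothetical broker address
            props.put("group.id", groupId);
            props.put("key.deserializer", StringDeserializer.class.getName());
            props.put("value.deserializer", StringDeserializer.class.getName());
            KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
            consumer.subscribe(Collections.singletonList("new-hires")); // hypothetical topic
            return consumer;
        }

        public static void main(String[] args) {
            // Each department's workflow subscribes under its own group ID, so a single
            // "new hire" message triggers all of them independently.
            try (KafkaConsumer<String, String> it = consumerFor("it-laptop-request");
                 KafkaConsumer<String, String> security = consumerFor("security-id-tag")) {
                for (ConsumerRecord<String, String> record : it.poll(Duration.ofSeconds(5))) {
                    System.out.println("IT workflow received: " + record.value());
                }
                for (ConsumerRecord<String, String> record : security.poll(Duration.ofSeconds(5))) {
                    System.out.println("Security workflow received: " + record.value());
                }
            }
        }
    }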

Configuration

Configuring Kafka messaging in Cora SeQuence involves the following steps:

  1. Create the Kafka connection (performed by the Cora SeQuence Administrator).
  2. Create a Kafka producer (performed by the Cora SeQuence Administrator).
  3. Configure the Kafka integration activities in the workflow (performed by the Developer):
    1. Configure the Kafka Producer activity.
    2. Configure the Kafka Subscriber activity.

For more details on how Apache Kafka works in Cora SeQuence, see this article.

Create the Kafka connection 

Configure the connection to the Kafka service. You can create new connections, edit existing ones, or delete them.  

Prerequisites

Before you create a Kafka connection string, make sure that you have:

  • A Kafka deployment (either SaaS or IaaS) that is up and running.
  • The Kafka connection details, including the required credentials.

For more information on the Kafka server requirements, see this article.

Procedure

  1. Go to Administration > Global Settings > Kafka Connections, and click Add New Record.
  2. Enter a meaningful name for the Kafka connection string.
  3. Enter the Connection URL.
  4. Select a security protocol.
    Each protocol type requires different parameters. The SASL SSL protocol is the recommended security protocol.

  • PlainText: Does not require SSL or credentials to connect.
  • SASLPlainText: Does not require SSL, but credentials are required (Store).
  • SSL: Does not require credentials, but you need to get the configuration settings from your System Admin to set up the required parameters: 
    • Key Store Type (currently, only JKS)
    • Key Store Password
    • Key Store Location
    • Trust Store Type (currently, only JKS)
    • Trust Store Password
    • Trust Store Location
  • SASLSSL: Covers both the SASL protocol and the SSL protocol, providing better security features. This is the recommended protocol when working with Cora SeQuence.
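
The connection fields above map onto standard Apache Kafka client security settings. The following Java sketch shows how an equivalent SASL SSL connection could be verified outside Cora SeQuence, assuming the PLAIN SASL mechanism; the broker address, credentials, and trust store details are placeholders, not values supplied by Cora SeQuence.

    import java.util.Properties;
    import org.apache.kafka.clients.admin.AdminClient;

    public class ConnectionCheckSketch {
        public static void main(String[] args) throws Exception {
            Properties props = new Properties();
            props.put("bootstrap.servers", "broker1:9093");       // the Connection URL (placeholder)
            props.put("security.protocol", "SASL_SSL");           // recommended protocol
            props.put("sasl.mechanism", "PLAIN");                 // assumed SASL mechanism
            props.put("sasl.jaas.config",
                "org.apache.kafka.common.security.plain.PlainLoginModule required "
                + "username=\"cora\" password=\"secret\";");      // credentials (placeholders)
            props.put("ssl.truststore.type", "JKS");              // Trust Store Type
            props.put("ssl.truststore.location", "/etc/kafka/client.truststore.jks"); // Trust Store Location (placeholder)
            props.put("ssl.truststore.password", "changeit");     // Trust Store Password (placeholder)

            // Listing topics confirms that the connection and security settings work.
            try (AdminClient admin = AdminClient.create(props)) {
                System.out.println("Topics visible to this connection: " + admin.listTopics().names().get());
            }
        }
    }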

To learn more about Kafka parameters, refer to the Kafka documentation.

Create a Kafka Producer

Define a Kafka producer to enable multiple activities to use the same producer definition. 

Prerequisite

  • Make sure that a Kafka connection exists.

Procedure

  1. Go to Administration > Global Settings > Kafka Producers, and click Add New Record.
  2. Enter a meaningful name for the Kafka Producer.
  3. Select the Kafka connection.
    Non-mandatory parameters have default values. 

To learn more, refer to the Kafka documentation.

Configure the Kafka Producer activity

The producer connects to Kafka and holds the message object, which contains the topic name. Each workflow instance can define a different topic.

Prerequisite

  • A Kafka producer has been configured.

Procedure

  1. To add a Kafka Producer activity to your workflow, in the App Studio, select Integration > Kafka Producer.
  2. Click the Kafka Producer activity.
  3. On the Kafka Producer Properties screen, enter a meaningful name and alias, and then click Next.
  4. Select a Kafka Producer from the list, or create a new one. 
  5. Click Next.
  6. Under KafkaRecord record Bindings, click KafkaRecord message, and clear the null option.
  7. Click the plus button next to the KafkaRecord message node to show the configurable properties (illustrated in the sketch after this procedure):
    • Enter the topic name.
      If the topic does not exist, depending on Kafka's configuration, a topic is automatically created.
    • Set the content and body of the message.
    • Set the key.
  8. Click Finish.
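
Conceptually, the activity produces a record made up of the same three values bound on the KafkaRecord message node: topic, key, and message body. The following Java sketch illustrates the idea using the standard Apache Kafka client rather than Cora SeQuence; the broker address, topic, key, and payload are hypothetical, and non-mandatory producer parameters fall back to the Kafka client defaults.

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class ProducerSketch {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "broker1:9092");   // hypothetical connection URL
            props.put("key.serializer", StringSerializer.class.getName());
            props.put("value.serializer", StringSerializer.class.getName());

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                // Topic, key, and message body: the three values configured on the KafkaRecord message node.
                ProducerRecord<String, String> record = new ProducerRecord<>(
                    "invoice-approvals",               // topic name (created automatically if Kafka allows it)
                    "invoice-4711",                    // key (placeholder)
                    "{\"status\":\"approved\"}");      // message body (placeholder)
                producer.send(record);                 // closing the producer flushes pending messages
            }
        }
    }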

Configure the Kafka Subscriber activity

The subscriber connects to Kafka and retrieves messages from a topic. Each subscriber connects to a specific topic. The Kafka subscriber runs as a JES job. 

Prerequisites

  • Obtain the relevant Kafka topic name from the Project Manager or Business Analyst.

Procedure

  1. To add a Kafka Subscriber activity to your workflow, in the App Studio, select Integration > Kafka Subscriber.
  2. Click the Kafka Subscriber activity.
  3. On the Kafka Subscriber Activity Properties screen, enter a meaningful name for the activity, and then click Next.
  4. Enter an alias for the Kafka subscriber, and then click Next.
  5. On the job definition screen, define the following:
    • Name: Enter a name for the actual job performed by the Kafka Subscriber activity.
    • Job Host: The list includes the servers on which JES (Job Execution Service) is installed.
      Do one of the following:
      • Select a server on which to run the job.  
      • Leave the field blank, if you want the system to allocate a server to run the job.
    • Job is enabled: Select this option only after you complete the workflow, or if you want to run the job for testing purposes.
    • Job supports redundancy: This option is selected by default. Clear the checkbox if you do not want to enable redundancy.
      For more details on job redundancy settings, see this article.
  6. Click Next.
  7. On the Command tab, set the following (these settings correspond to the consumer sketch after this procedure):
    • Kafka Connection (mandatory): Select the relevant connection string.  
    • Group ID (mandatory): Enter the group ID relevant to your implementation.  
    • Topic Name (mandatory): Enter the Kafka topic name.
      If the topic does not exist, depending on Kafka's configuration, a topic is automatically created.
  8. Click Finish.
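
For reference, the mandatory Command tab settings correspond to the group ID and topic subscription of a standard Kafka consumer, which is roughly what the JES job manages on your behalf. The following Java sketch is illustrative only; the broker address, group ID, and topic name are placeholders.

    import java.time.Duration;
    import java.util.Collections;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.common.serialization.StringDeserializer;

    public class SubscriberSketch {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "broker1:9092");    // the Kafka connection (placeholder)
            props.put("group.id", "cora-invoice-workflows");   // the mandatory Group ID (placeholder)
            props.put("key.deserializer", StringDeserializer.class.getName());
            props.put("value.deserializer", StringDeserializer.class.getName());

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                // The mandatory Topic Name; created automatically if Kafka allows it.
                consumer.subscribe(Collections.singletonList("invoice-approvals"));
                while (true) {
                    ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                    for (ConsumerRecord<String, String> record : records) {
                        // In Cora SeQuence, the JES job would hand the message to the workflow at this point.
                        System.out.printf("key=%s value=%s%n", record.key(), record.value());
                    }
                }
            }
        }
    }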

To learn more, refer to the Kafka documentation.