
Configure Salesforce Platform Events Source Connector



In this article, I'll show how to send data from Salesforce Platform Events to a Kafka topic by setting up a Salesforce Platform Events Source Connector, using a property file as the source of the configuration in Kafka Connect.

Prerequisites

Instructions

1. Create a new Connected App in Salesforce. From Setup, search for App Manager in the Quick Find box and click App Manager. On the App Manager page, click New Connected App to set up a new API client.

We need the following details from Salesforce:

  • Consumer Key — Can be retrieved from the Connected App.
  • Consumer Secret — Can be retrieved from the Connected App.
  • Username — The username you use to log in to Salesforce.
  • Password — The password you use to log in to Salesforce.
  • Security Token — Can be generated from your profile settings in Salesforce.


2. Create a Platform Event in Salesforce. From Setup, enter Platform Events in the Quick Find box, then select Platform Events. On the Platform Events page, click New Platform Event.

You can refer to this Trailhead module for how to set one up.

https://trailhead.salesforce.com/en/content/learn/modules/platform_events_basics/platform_events_define_publish 


3. Install the Kafka Connect Salesforce Connector by executing the command confluent-hub install confluentinc/kafka-connect-salesforce:latest on your Confluent Platform server.


4. Restart the Confluent services by executing confluent local stop followed by confluent local start.


5. Go to the Confluent Control Center at http://<Public IP>:9021.

On the Connect tab, click Add Connector; you should see all available connectors, including the newly installed Salesforce source and sink connectors.


6. Create a new topic in the Confluent Control Center by going to the Topics tab and clicking the Add a Topic button. Use the default settings to create it.

Topic Name: sf_cloud_news_pe_in

7. Create a .properties file containing the desired configuration for the Salesforce Platform Events Source Connector and save it locally, as we will upload it to the Confluent Control Center.

Salesforce Platform Event Source Connector Configuration Details

# Connector Name
name=sf-cloud-news-pe-source-connector
# You can define more than 1 task handler for performance tuning/scaling
tasks.max=1

# Type of Connector
connector.class=io.confluent.salesforce.SalesforcePlatformEventSourceConnector

# Data converter class — using JSON
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter

# Salesforce Credentials
salesforce.consumer.key=<SALESFORCE CONNECTED APP CONSUMER KEY>
salesforce.consumer.secret=<SALESFORCE CONNECTED APP CONSUMER SECRET>
salesforce.username=<SALESFORCE USERNAME>
salesforce.password=<SALESFORCE PASSWORD>
salesforce.password.token=<SALESFORCE SECURITY TOKEN>
salesforce.instance=https://login.salesforce.com

# Salesforce Platform Event Name
salesforce.platform.event.name=Cloud_News__e
salesforce.initial.start=all

# Target Topic
kafka.topic=sf_cloud_news_pe_in

# License Setting — Default settings for trial/local version
confluent.topic.replication.factor=1
confluent.topic.bootstrap.servers=localhost:9092
confluent.license=

sf-cloud-news-pe-source-connector.properties
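If you would rather use the Kafka Connect REST API than the Control Center upload, the same settings can be expressed as JSON. This is a sketch, with the same placeholder values as the properties file; the /connectors endpoint is the standard Connect REST API, while the host and port are assumptions from this local setup.

```shell
# Write the connector config as a JSON payload for the Connect REST API.
# Placeholder values in angle brackets must be replaced, as in the
# .properties file above.
cat > sf-cloud-news-pe-source-connector.json <<'EOF'
{
  "name": "sf-cloud-news-pe-source-connector",
  "config": {
    "connector.class": "io.confluent.salesforce.SalesforcePlatformEventSourceConnector",
    "tasks.max": "1",
    "key.converter": "org.apache.kafka.connect.json.JsonConverter",
    "value.converter": "org.apache.kafka.connect.json.JsonConverter",
    "salesforce.consumer.key": "<SALESFORCE CONNECTED APP CONSUMER KEY>",
    "salesforce.consumer.secret": "<SALESFORCE CONNECTED APP CONSUMER SECRET>",
    "salesforce.username": "<SALESFORCE USERNAME>",
    "salesforce.password": "<SALESFORCE PASSWORD>",
    "salesforce.password.token": "<SALESFORCE SECURITY TOKEN>",
    "salesforce.instance": "https://login.salesforce.com",
    "salesforce.platform.event.name": "Cloud_News__e",
    "salesforce.initial.start": "all",
    "kafka.topic": "sf_cloud_news_pe_in",
    "confluent.topic.bootstrap.servers": "localhost:9092",
    "confluent.topic.replication.factor": "1"
  }
}
EOF
# Submit it once the Connect worker is running:
# curl -s -X POST -H "Content-Type: application/json" \
#   --data @sf-cloud-news-pe-source-connector.json http://localhost:8083/connectors
```

Note that REST submissions nest the settings under a config key, with the connector name given separately at the top level.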

8. In the Confluent Control Center, go to the Connect page, click Upload Connector Config File, then browse to and select the .properties file.


9. You should see the Connector Details page. All configuration parameters from the file should be replicated in the form. Scroll down and click Continue.


10. Click Launch to deploy the connector.


11. You should see that the connector status is Running. If the connector is failing, you can check the connector logs by sending a GET request to http://<HOST>:8083/connectors/<CONNECTOR NAME>/status or by executing confluent local status <CONNECTOR NAME> on the server.
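The status URL from this step can be put together in a small shell sketch; the host, port, and connector name below are the values assumed throughout this walkthrough.

```shell
# Build the Kafka Connect REST status URL for this connector.
# HOST is a placeholder for your Connect worker's address.
HOST=localhost
CONNECTOR=sf-cloud-news-pe-source-connector
STATUS_URL="http://${HOST}:8083/connectors/${CONNECTOR}/status"
echo "$STATUS_URL"
# Query it once the worker is up:
# curl -s "$STATUS_URL"
```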


Testing the Connector

1. Publish a new record to the Platform Event Cloud_News__e by executing Apex code in the Salesforce Developer Console.

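The original Apex snippet appeared only as a screenshot, so here is a hedged reconstruction. The field name News_Content__c is an assumption; substitute the custom fields you actually defined on Cloud_News__e. The heredoc just writes the snippet to a file; paste its contents into the Developer Console's Execute Anonymous window.

```shell
# Hypothetical Apex to publish a Cloud_News__e event; the field name is an
# assumption and should match your event's custom fields.
cat > publish_cloud_news.apex <<'EOF'
Cloud_News__e newsEvent = new Cloud_News__e(
    News_Content__c = 'Test event from the Developer Console'
);
Database.SaveResult result = EventBus.publish(newsEvent);
System.debug('Published: ' + result.isSuccess());
EOF
```

EventBus.publish returns a Database.SaveResult, so the debug line confirms whether the event was accepted for delivery.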

2. In the Confluent Control Center, we should see that the record from Salesforce was sent to the topic sf_cloud_news_pe_in.


That's it! We just learned how to set up a Salesforce Platform Events Connector as a source in Kafka Connect by simply uploading a properties file containing the custom configuration.

For more details, you can check this documentation from Confluent.

https://docs.confluent.io/current/connect/kafka-connect-salesforce/index.html

I hope this helps!

Topics:
big data, confluent, kafka connect platform, kafka connectors, salesforce configuration, streaming, tutorial

