Change Data Capture From CockroachDB to Confluent Cloud
Here's a simple tutorial for sending data from CockroachDB directly to Confluent Cloud using CockroachDB Change Data Capture, typically referred to as a changefeed. This example can be applied to CockroachCloud or a self-hosted deployment of CockroachDB. The tutorial was tested on CockroachDB 20.2 and Confluent Cloud 1.25.
Set Up Confluent Cloud
Set Up Your Kafka Cluster
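A cluster can be created from the Confluent Cloud web console or from the `ccloud` CLI (the v1.x CLI matching the Confluent Cloud 1.25 version this tutorial was tested on). A minimal sketch, assuming the CLI route; the cluster name, cloud provider, and region below are examples, not requirements:

```shell
# Log in to Confluent Cloud (prompts for your credentials).
ccloud login

# Create a Kafka cluster. Name, provider, and region are illustrative --
# pick whatever fits your account and quota.
ccloud kafka cluster create crdb-changefeed-demo --cloud gcp --region us-east1
```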
Get Kafka Resource ID
The ID listed here for your Kafka cluster will be needed in the steps below
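A quick sketch of looking the ID up from the CLI; the resource ID takes the form `lkc-` followed by an identifier:

```shell
# List your Kafka clusters and note the resource ID (it looks like lkc-xxxxx).
ccloud kafka cluster list
```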
Create API Keys
The API Key and API Secret are needed for creating the CockroachDB Changefeed
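Assuming the `ccloud` CLI, the key pair can be created and scoped to the cluster in one command; `lkc-xxxxx` below is a placeholder for the resource ID from the previous step:

```shell
# Create an API key/secret pair scoped to your Kafka cluster.
# Save both values now -- the secret is only displayed once.
ccloud api-key create --resource lkc-xxxxx
```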
Get Kafka End Point
The endpoint is needed to connect the changefeed to Kafka
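One way to retrieve it, sketched with the same placeholder resource ID as above:

```shell
# Describe the cluster to find its bootstrap endpoint, reported as
# something like SASL_SSL://pkc-xxxxx.<region>.<cloud>.confluent.cloud:9092.
ccloud kafka cluster describe lkc-xxxxx
```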
Start a Kafka Consumer To Verify Your Change Data Feed
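CockroachDB writes changefeed messages to a Kafka topic named after the watched table. A sketch assuming a hypothetical example table called `office_dogs` (created in the CockroachDB steps below) and the `ccloud` CLI:

```shell
# Create the topic the changefeed will write to; the name must match
# the table being watched ("office_dogs" is our example table).
ccloud kafka topic create office_dogs --cluster lkc-xxxxx

# Tail the topic so you can watch change events arrive in real time.
ccloud kafka topic consume office_dogs --cluster lkc-xxxxx
```

Leave this consumer running; you'll come back to it at the end to verify the feed.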
Set Up CockroachDB or CockroachCloud
Note that changefeeds do not currently work on the CockroachCloud Free Tier. Use a Dedicated cluster to try this instead.
Open a new terminal window and leave the Kafka consumer open for later. Log in to the cockroach sql command line and enter the following commands.
First, ensure rangefeeds are enabled.
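Rangefeeds must be enabled cluster-wide before any changefeed can run. A minimal sketch, assuming a local insecure cluster (drop `--insecure` and add your connection flags for a secure or CockroachCloud deployment):

```shell
# Enable rangefeeds, which changefeeds are built on.
cockroach sql --insecure -e "SET CLUSTER SETTING kv.rangefeed.enabled = true;"
```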
Next, create a table.
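The table name and columns below are illustrative, assuming the same `office_dogs` example used for the Kafka topic:

```shell
# Create a small example table to watch with the changefeed.
cockroach sql --insecure -e "
CREATE TABLE office_dogs (
    id INT PRIMARY KEY,
    name STRING
);"
```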
Create a Changefeed
When creating the changefeed, notice that you'll use kafka:// instead of the endpoint scheme returned earlier by Confluent Cloud (i.e., SASL_SSL://). Also, be sure to include your API key and secret in the changefeed.
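A sketch of the statement, using CockroachDB's Kafka sink URI parameters (`tls_enabled`, `sasl_enabled`, `sasl_user`, `sasl_password`); the host is a placeholder for your own endpoint, and the API secret must be URL-encoded:

```shell
# Swap SASL_SSL:// for kafka:// in the endpoint, and substitute your
# API key and URL-encoded API secret in the query string.
cockroach sql --insecure -e "
CREATE CHANGEFEED FOR TABLE office_dogs
  INTO 'kafka://pkc-xxxxx.us-east1.gcp.confluent.cloud:9092?tls_enabled=true&sasl_enabled=true&sasl_user=<API_KEY>&sasl_password=<URL_ENCODED_API_SECRET>'
  WITH updated;"
```

The `updated` option adds a commit timestamp to each emitted message, which is handy when verifying ordering downstream.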
Insert Some Rows
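Any writes to the watched table will now flow to Kafka; the row values below are illustrative:

```shell
# Insert a couple of rows so the changefeed has something to emit.
cockroach sql --insecure -e "
INSERT INTO office_dogs (id, name) VALUES
    (1, 'Petee'),
    (2, 'Carl');"
```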
Verify Data Is Showing up in Your Consumer App
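With the default envelope plus the `updated` option, each message is a JSON document keyed by the row's primary key. The exact values depend on your data and commit times; the shape looks roughly like this:

```shell
# Illustrative consumer output -- values will differ in your run:
# {"after": {"id": 1, "name": "Petee"}, "updated": "1611081554000000000.0000000000"}
# {"after": {"id": 2, "name": "Carl"}, "updated": "1611081554000000000.0000000000"}
```

If the messages appear in the consumer window you left open earlier, the changefeed is working end to end.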
Published at DZone with permission of Chris Casano. See the original article here.