Fast, Easy Access to Secure Kafka Clusters

Connecting to your secure Kafka clusters is a straightforward process. Here's how to configure your broker, client, and StreamSets Connector for access.

By Harikiran Nayak · Sep. 05, 17 · Tutorial


It’s simple to connect StreamSets Data Collector (SDC) to Apache Kafka through the Kafka Consumer Origin and Kafka Producer Destination connectors. Because those connectors support all Kafka client options, including the secure Kafka (SSL and SASL) options, connecting to an SSL-enabled secure Kafka cluster is just as easy. In this blog post, I'll walk through the steps required.

Secure Kafka Broker Configuration

First, the Kafka broker must be configured to accept client connections over SSL. Refer to the Apache Kafka documentation to configure your broker. If your Kafka cluster is already SSL-enabled, you can look up the port number in your Kafka broker configuration file (or in the broker logs); look for the listeners=SSL://host.name:port configuration option. To ensure that the Kafka broker is correctly configured to accept SSL connections, run the following command from the same host that SDC runs on. If SDC is running inside a Docker container, log in to that container and run the command there.

$ openssl s_client -debug -connect host.name:port -tls1

The above command should report “CONNECTED” and print the handshake and certificate details, as shown (truncated) below:

CONNECTED(00000003)
write to 0x7fae22c02e50 [0x7fae23816800] (100 bytes => 100 (0x64))
0000 - 16 03 01 00 5f 01 00 00-5b 03 01 59 94 d1 29 a1   ...._...[..Y..).
0010 - a3 77 37 3d f1 51 ea ab-eb 54 ee 64 36 1e 39 b5   .w7=.Q...T.d6.9.
...

If you don’t see output like this, double-check the broker configuration file. (Remember that you’ll need to restart the broker after changing the configuration.)
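For reference, the broker-side SSL settings in the Kafka server.properties file typically look something like the following. This is only a sketch; the port, file paths, and passwords are placeholders and must match your own environment:

# placeholder values - adjust the port, file paths, and passwords for your cluster
listeners=SSL://host.name:9093
ssl.keystore.location=/full/path/to/server.keystore.jks
ssl.keystore.password=*****
ssl.key.password=*****
ssl.truststore.location=/full/path/to/server.truststore.jks
ssl.truststore.password=*****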

Secure Kafka Client Configuration

The next step is to prepare the Keystore and Truststore files which will be used by Kafka clients and SDC Kafka connectors. The Apache Kafka Documentation shows how to generate a Certificate Authority (CA) and self-signed certificates and import them into the keystore and truststore (JKS).
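If you are starting from scratch, those steps from the Kafka documentation look roughly like the commands below. This is a sketch only; the keystore name, alias, validity period, and file names are placeholders:

# generate a key pair and certificate for the broker
$ keytool -keystore server.keystore.jks -alias localhost -validity 365 -genkey
# create your own Certificate Authority (CA)
$ openssl req -new -x509 -keyout ca-key -out ca-cert -days 365
# export a signing request, sign it with the CA, then import the CA cert and the signed cert
$ keytool -keystore server.keystore.jks -alias localhost -certreq -file cert-file
$ openssl x509 -req -CA ca-cert -CAkey ca-key -in cert-file -out cert-signed -days 365 -CAcreateserial
$ keytool -keystore server.keystore.jks -alias CARoot -import -file ca-cert
$ keytool -keystore server.keystore.jks -alias localhost -import -file cert-signed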

However, if you already have an existing certificate key pair, you will need to convert it to PKCS12 format before importing into JKS. The following steps describe this in detail.

  • Convert the key/cert to PKCS12 format and import into the JKS:
$ openssl pkcs12 -export -in {your-cert-file} -inkey {your-key-file} > host.p12

$ keytool -importkeystore -srckeystore host.p12 -destkeystore client.keystore.jks -srcstoretype pkcs12


  • Import the CA into the client Keystore:
$ keytool -keystore client.keystore.jks -alias localhost -import -file CA


  • Import the CA into the client Truststore:
$ keytool -keystore client.truststore.jks -alias CARoot -import -file CA


Now your keystore and truststore files are ready for use. Verify that everything has been set up correctly using the built-in Kafka Console Consumer and Producer by doing the following:

  • Create a “client-ssl.properties” file and copy this configuration:

security.protocol=SSL
ssl.truststore.location=/full/path/to/your/client/truststore/client.truststore.jks
ssl.truststore.password=*****
ssl.keystore.location=/full/path/to/your/client/keystore/client.keystore.jks
ssl.keystore.password=*****
ssl.key.password=*****

(Note: “ssl.key.password” is the password of the private key. The private key password and keystore password must be the same when using JKS.)
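If the key password in your existing keystore differs from the keystore password, you can change it to match using keytool. The alias below is an assumption; run keytool -list -keystore client.keystore.jks to see the actual alias in your keystore:

# change the private key's password (prompts for the keystore, old, and new passwords)
$ keytool -keypasswd -alias localhost -keystore client.keystore.jks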

  • Start the Kafka Console Consumer:
$ kafka-console-consumer.sh --bootstrap-server host.name:port --topic test --new-consumer --consumer.config client-ssl.properties


  • Start the Kafka Console Producer:
$ kafka-console-producer.sh --broker-list host.name:port --topic test --producer.config client-ssl.properties


Type a message in the producer console and verify that the consumer receives it. This confirms that your secure Kafka setup is working correctly.

StreamSets Kafka Connector Configuration

Finally, configure the SDC Kafka connectors, which is by far the easiest part! When writing to Kafka, use the “Kafka Configuration” option in the Kafka Producer destination to pass the security-related options.
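The values to enter mirror the client-ssl.properties file from the previous section, with each option added as a separate key/value pair in the SDC UI. Shown here in properties form as an illustration, with placeholder paths and passwords:

security.protocol=SSL
ssl.truststore.location=/full/path/to/your/client/truststore/client.truststore.jks
ssl.truststore.password=*****
ssl.keystore.location=/full/path/to/your/client/keystore/client.keystore.jks
ssl.keystore.password=*****
ssl.key.password=*****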

If you are reading data from Kafka, use the same options in your Kafka Consumer Origin. Refer to the documentation for details on the options. As always, if you have questions while configuring your StreamSets Data Collector pipelines, you can ask in the Community Slack or on the Community mailing list.


Published at DZone with permission of Harikiran Nayak, DZone MVB. See the original article here.

Opinions expressed by DZone contributors are their own.
