
Integrate Oracle Database With Apache Kafka Using Debezium

Oracle Databases are used for traditional enterprise applications and departmental systems in large enterprises. The Debezium connector for Oracle is a great way to capture data changes from the transactional system of record and make them available for application use.

By Hugo Guerrero · Aug. 16, 22 · Tutorial
Oracle Databases are used for traditional enterprise applications, cloud-native use cases, and departmental systems in large enterprises. The Debezium connector for Oracle is a great way to capture data changes from the transactional system of record and make them available for use cases such as replication, data warehousing, and real-time analytics.

What Is Debezium?

Debezium is an open source distributed streaming platform for change data capture (CDC) that provides Apache Kafka Connect connectors for several databases, including Oracle.

Applications use AMQ Streams to consume change events. Debezium is built on the Apache Kafka Connect framework, which makes all of Debezium's connectors Kafka Connect source connectors. They can be deployed and managed using AMQ Streams' Kafka Connect custom Kubernetes resources.
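As a rough sketch of what such a deployment looks like, a KafkaConnector custom resource can describe the Debezium Oracle connector declaratively. Property names follow the Debezium 1.9-era conventions; the cluster name, hostnames, credentials, and table list below are illustrative placeholders, not values from this article:

```yaml
apiVersion: kafka.strimzi.io/v1beta2
kind: KafkaConnector
metadata:
  name: oracle-source-connector            # illustrative name
  labels:
    strimzi.io/cluster: my-connect-cluster # must match your Kafka Connect cluster
spec:
  class: io.debezium.connector.oracle.OracleConnector
  tasksMax: 1
  config:
    database.hostname: oracle.example.com  # illustrative host
    database.port: 1521
    database.user: c##dbzuser              # illustrative capture user
    database.password: dbz
    database.dbname: ORCLCDB
    database.server.name: server1          # logical name, prefixes topic names
    table.include.list: DEBEZIUM.CUSTOMERS # tables to capture
    database.history.kafka.bootstrap.servers: my-cluster-kafka-bootstrap:9092
    database.history.kafka.topic: schema-changes.oracle
```

Applying this resource with `oc apply -f` lets the AMQ Streams operator create and manage the connector, rather than calling the Kafka Connect REST API directly.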

Change Data Capture

Change data capture, or CDC, is a well-established software design pattern for a system that monitors and captures the changes in data so that other software can respond to those changes. CDC captures row-level changes to database tables and passes corresponding change events to a data streaming bus.

The connector enables you to connect to your database and consume the stream of change events that your data integration process generates. The event stream can be used to propagate changes to other systems or to perform other operations on the data in real time.

The connector captures transactions, including data and schema changes, and publishes them to Kafka topics, where they can be consumed by data integration processes such as streaming ETL or replication tools.
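To make the event shape concrete, here is a small consumer-side sketch that parses the payload portion of a Debezium change-event envelope. The field names (`op`, `before`, `after`, `source`) follow Debezium's documented envelope format; the table, column, and SCN values are made up for illustration:

```python
import json

# A simplified Debezium change-event payload. "op" is the operation type
# (c=create, u=update, d=delete, r=snapshot read), "before"/"after" hold the
# row state, and "source" carries provenance such as the Oracle SCN.
event = json.loads("""
{
  "op": "u",
  "ts_ms": 1660600000000,
  "before": {"ID": 1, "NAME": "old name"},
  "after":  {"ID": 1, "NAME": "new name"},
  "source": {"connector": "oracle", "table": "CUSTOMERS", "scn": "2868545"}
}
""")

def describe(payload):
    """Summarize a change event for downstream processing."""
    ops = {"c": "insert", "u": "update", "d": "delete", "r": "snapshot read"}
    kind = ops.get(payload["op"], "unknown")
    table = payload["source"]["table"]
    return f"{kind} on {table}: {payload['before']} -> {payload['after']}"

print(describe(event))
```

A real consumer would receive these payloads from the Kafka topic rather than a literal string, but the structure it unpacks is the same.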

Connector Capabilities

Debezium ingests change events using Oracle’s native LogMiner database package. Oracle LogMiner is part of the Oracle Database utilities and provides a well-defined, easy-to-use, and comprehensive interface for querying online and archived redo log files.
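LogMiner-based capture has database-side prerequisites. As a sketch of the commonly documented setup (run as a DBA; exact steps depend on your environment and edition), the database must be in archive-log mode with supplemental logging enabled:

```sql
-- Run while connected as SYSDBA; requires a restart into MOUNT state.
SHUTDOWN IMMEDIATE;
STARTUP MOUNT;
ALTER DATABASE ARCHIVELOG;   -- enable archive-log mode so LogMiner can read redo
ALTER DATABASE OPEN;
-- Minimal supplemental logging so redo records carry enough row detail.
ALTER DATABASE ADD SUPPLEMENTAL LOG DATA;
```

The connector also needs a dedicated database user with the LogMiner-related grants described in the Debezium documentation.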

The first time the Debezium Oracle connector starts, it performs an initial consistent snapshot of the database so that it begins with a complete and consistent data set. You can change this behavior by setting the snapshot.mode property. After the connector completes its initial snapshot, it continues streaming changes from the system change number (SCN) position it recorded in the redo log.
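For example, snapshot.mode is just another entry in the connector configuration: `initial` (the default) takes a data snapshot before streaming, while `schema_only` captures only the table structure and then streams from the current SCN. The connector name and server name below are illustrative for a 1.9-era connector:

```json
{
  "name": "oracle-connector",
  "config": {
    "connector.class": "io.debezium.connector.oracle.OracleConnector",
    "database.server.name": "server1",
    "snapshot.mode": "schema_only"
  }
}
```

With AMQ Streams, the same properties go under `spec.config` of the KafkaConnector custom resource instead of a JSON file posted to the Kafka Connect REST API.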

The Debezium connector for Oracle Databases supports the following database versions:

  • Oracle Database 12.2 EE
  • Oracle Database 19.3 EE
  • Oracle Database 21.3 EE

The Oracle connector used with Oracle Real Application Clusters (RAC) is in Technology Preview (TP) for this release.

In the following video, I describe a simple example of enabling the connectors to capture events from a containerized Oracle Database and then send those events to AMQ Streams running on an OpenShift cluster.

Check the complete supported configurations for more information.

Red Hat Application Foundations

The Debezium connector for Oracle databases is part of the supported components of Red Hat Application Foundations, a 100% open source initiative designed to help organizations modernize their applications with the latest technologies. The connector became generally available (GA) with the 1.9 release, and you can deploy it on Red Hat OpenShift and Red Hat Enterprise Linux.

The connector enables users to easily connect their databases using a single tool, with no need for separate development teams or IT teams managing dozens of different systems. This means you can focus on building great products instead of worrying about how they'll work together once deployed.

Conclusion

As we can see, change data capture is one of the tools used to bridge traditional data stores and new cloud-native event-driven architectures. The Debezium connector is now part of the Red Hat Application Foundations (RHAF) family and is available for use in your applications. The connector supports AMQ Streams, a high-performance, scalable distribution of Apache Kafka that allows applications to consume data from many sources as event streams. 

Get started by downloading the Red Hat Application Foundations Debezium CDC connectors from Red Hat Developer.


Opinions expressed by DZone contributors are their own.
