added CDC Connector
jr-marquez committed Feb 23, 2021
1 parent 0075e93 commit a1fccbb
Showing 26 changed files with 142 additions and 1 deletion.
@@ -0,0 +1,40 @@
# Introduction

This project provides a source connector to capture changes from an
Oracle database.

# Documentation

Documentation on the connector is hosted on Confluent's
[docs site](https://docs.confluent.io/current/connect/kafka-connect-oracle-cdc/).

Source code is located in Confluent's
[docs repo](https://github.com/confluentinc/docs/tree/master/connect/kafka-connect-oracle-cdc). If changes
are made to configuration options for the connector, be sure to generate the RST docs (as described
below) and open a PR against the docs repo to publish those changes!

# Configs

Documentation on the configurations for each connector can be automatically generated via Maven.

To generate documentation for the sink connector:
```bash
mvn -Pdocs exec:java@sink-config-docs
```

To generate documentation for the source connector:
```bash
mvn -Pdocs exec:java@source-config-docs
```

# Compatibility Matrix

This connector has been tested against the following versions of Apache
Kafka and Oracle CDC:

|            | AK 1.0             | AK 1.1 | AK 2.0 |
| ---------- | ------------------ | ------ | ------ |
| **Oracle** | NOT COMPATIBLE (1) | TBD    | TBD    |

1. The connector needs Connect headers and source connector access to
offsets.
@@ -0,0 +1,40 @@
#
# Copyright [2019 - 2019] Confluent Inc.
#

# Sample configuration for a standalone Kafka Connect worker that uses Avro serialization and
# integrates with the Schema Registry.
#
# This sample configuration assumes a local installation of Confluent Platform with all services
# running on their default ports, and a local copy of the connector project.

# Bootstrap Kafka servers. If multiple servers are specified, they should be comma-separated.
bootstrap.servers=confluent:9092

# The converters specify the format of data in Kafka and how to translate it into Connect data.
# Every Connect user will need to configure these based on the format they want their data in
# when loaded from or stored into Kafka
key.converter=io.confluent.connect.avro.AvroConverter
key.converter.schema.registry.url=http://confluent:8081
value.converter=io.confluent.connect.avro.AvroConverter
value.converter.schema.registry.url=http://confluent:8081
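
# With the AvroConverter, Schema Registry subjects are by default derived from the
# topic name (the TopicNameStrategy, which appends "-key" and "-value"). A minimal
# Python sketch of that naming rule; `subject_for` is a hypothetical helper for
# illustration, not part of the Connect API:
#
# ```python
# # Illustrates the default TopicNameStrategy subject naming used by
# # AvroConverter: subjects come from the topic name, not the record type.
# # `subject_for` is a hypothetical helper, not a Connect API.
# def subject_for(topic: str, is_key: bool) -> str:
#     return f"{topic}-{'key' if is_key else 'value'}"
#
# print(subject_for("redo-log-topic", is_key=True))   # redo-log-topic-key
# print(subject_for("redo-log-topic", is_key=False))  # redo-log-topic-value
# ```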

# The internal converter used for offsets and config data is configurable and must be specified,
# but most users will always want to use the built-in default. Offset and config data is never
# visible outside of Connect in this format.
internal.key.converter=org.apache.kafka.connect.json.JsonConverter
internal.value.converter=org.apache.kafka.connect.json.JsonConverter
internal.key.converter.schemas.enable=false
internal.value.converter.schemas.enable=false
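
# The effect of `schemas.enable` on the JsonConverter: when enabled, each record is
# wrapped in a {"schema": ..., "payload": ...} envelope; when disabled, only the bare
# payload is serialized. A hedged Python sketch of that envelope (the schema literal
# here is illustrative, not the converter's actual schema inference):
#
# ```python
# import json
#
# # Sketch of the JsonConverter envelope. With schemas.enable=true the value
# # is wrapped as {"schema": ..., "payload": ...}; with false it is written
# # bare. Hypothetical helper, not the Connect implementation.
# def to_json(payload, schemas_enable: bool) -> str:
#     if schemas_enable:
#         return json.dumps({"schema": {"type": "string", "optional": False},
#                            "payload": payload})
#     return json.dumps(payload)
#
# print(to_json("offset-data", schemas_enable=False))  # "offset-data"
# ```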

# Local storage file for offset data
offset.storage.file.filename=/tmp/connect.offsets

# Confluent Control Center Integration -- uncomment these lines to enable Kafka client interceptors
# that will report audit data that can be displayed and analyzed in Confluent Control Center
# producer.interceptor.classes=io.confluent.monitoring.clients.interceptor.MonitoringProducerInterceptor
# consumer.interceptor.classes=io.confluent.monitoring.clients.interceptor.MonitoringConsumerInterceptor

# Load our plugin from the directory where a local Maven build creates the plugin archive.
# Paths can be absolute or relative to the directory from where you call the Connect worker.
plugin.path=target/components/packages/
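
# Worker files like the one above use Java .properties syntax (key=value lines, '#'
# comments). A minimal Python sketch of parsing such a file into a dict, e.g. to
# inspect or template a configuration; it deliberately ignores line continuations
# and escapes, which full .properties parsing would handle:
#
# ```python
# # Minimal .properties parser sketch: skips blanks and '#' comments,
# # splits on the first '='. Not a complete Java properties implementation.
# def parse_properties(text: str) -> dict:
#     props = {}
#     for line in text.splitlines():
#         line = line.strip()
#         if not line or line.startswith("#"):
#             continue
#         key, _, value = line.partition("=")
#         props[key.strip()] = value.strip()
#     return props
#
# sample = """
# bootstrap.servers=confluent:9092
# key.converter=io.confluent.connect.avro.AvroConverter
# """
# print(parse_properties(sample)["bootstrap.servers"])  # confluent:9092
# ```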
@@ -0,0 +1,25 @@
#
# Copyright [2019 - 2019] Confluent Inc.
#

log4j.rootLogger=INFO, stdout

log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=[%d] %p %m (%c:%L)%n

log4j.logger.org.apache.zookeeper=ERROR
log4j.logger.org.I0Itec.zkclient=ERROR
log4j.logger.org.reflections=ERROR
log4j.logger.org.eclipse.jetty=ERROR
#log4j.logger.org.apache.kafka.connect.runtime=DEBUG
#log4j.logger.org.apache.kafka.clients.consumer.KafkaConsumer=DEBUG

#log4j.logger.io.confluent.connect.oracle=DEBUG

# Uncomment this line to periodically log per-table RPS stats. Logging interval is controlled by
# `table.rps.logging.interval.ms`
#log4j.logger.io.confluent.connect.oracle.cdc.metrics=DEBUG

# Uncomment this line to log SQL statements
#log4j.logger.io.confluent.connect.oracle.cdc.logging=DEBUG
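
# The log4j ConversionPattern "[%d] %p %m (%c:%L)%n" renders timestamp, level,
# message, then logger name and line number. A rough Python-logging analogue,
# shown only to illustrate what that pattern produces (the logger name and line
# number here are made up for the example):
#
# ```python
# import logging
#
# # Python format string approximating log4j's "[%d] %p %m (%c:%L)%n".
# fmt = logging.Formatter(
#     "[%(asctime)s] %(levelname)s %(message)s (%(name)s:%(lineno)d)")
# record = logging.LogRecord("io.confluent.connect.oracle", logging.INFO,
#                            __file__, 42, "connector started", None, None)
# line = fmt.format(record)
# # prints e.g. "[<timestamp>] INFO connector started (io.confluent.connect.oracle:42)"
# print(line)
# ```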
@@ -0,0 +1,8 @@
name=SimpleOracleCDC
tasks.max=1
connector.class=io.confluent.connect.oracle.cdc.OracleCdcSourceConnector
redo.log.consumer.bootstrap.servers=localhost:9092
confluent.topic.bootstrap.servers=localhost:9092
log.topic.name=redo-log-topic
table.inclusion.regex=test-table
connection.pool.max.size=10
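
# `table.inclusion.regex` selects which tables the connector captures. A minimal
# Python sketch of how an include/exclude regex pair might be evaluated, assuming
# full-match semantics for illustration; the `exclude` pattern and the sample
# table names are hypothetical (real deployments typically match fully
# qualified table names):
#
# ```python
# import re
#
# # Include/exclude matching sketch, full-match semantics assumed.
# include = re.compile("test-table")
# exclude = None  # hypothetical, e.g. re.compile("test-table-archive")
#
# def is_captured(table: str) -> bool:
#     if not include.fullmatch(table):
#         return False
#     return exclude is None or not exclude.fullmatch(table)
#
# print(is_captured("test-table"))   # True
# print(is_captured("other-table"))  # False
# ```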
20 binary files not shown.
@@ -0,0 +1,28 @@
{
"name" : "kafka-connect-oracle-cdc",
"version" : "1.0.3",
"title" : "Kafka Connect OracleCDC Connector",
"description" : "The Confluent Oracle CDC Source Connector is a Premium Confluent connector and requires an additional subscription, specifically for this connector.\n<p>The Oracle CDC Source Connector captures changes in an Oracle database and writes the changes as change event records in Kafka topics. The connector uses Oracle LogMiner to read the database's redo log and requires supplemental logging with \"ALL\" columns. The connector supports Oracle 11c, 12c, and 18c. It supports both container databases and non-container databases, and supports databases running on-premises or in the cloud.\n<p>The connector can be configured to capture a subset of the tables in a single Oracle database. The captured tables are all tables accessible by the user that match an \"include\" pattern and do not match a separate \"exclude\" pattern. The connector can optionally begin by taking a snapshot of each of the tables, to capture all rows in their current state before changes are recorded. The connector then continues by capturing the individual row-level changes committed by database users. It writes these change events to Kafka topics using flexible table-to-topic mapping. By default, all change events from each table are written to a separate Kafka topic.",
"owner" : {
"username" : "confluentinc",
"name" : "Confluent, Inc."
},
"support" : {
"summary" : "This connector is a Confluent Premium Connector and <a href=\"https://www.confluent.io/subscription/\">supported by Confluent</a>. The Confluent Oracle CDC Source Connector requires purchase of a <a href=\"https://www.confluent.io/product/confluent-platform/\">Confluent Platform</a> subscription, including a license to this Premium Connector. You can also use this connector for a 30-day trial without an enterprise license key - after 30 days, you need to purchase a subscription. Please contact your Confluent account manager for details."
},
"tags" : [ "change data capture", "database", "CDC", "dbms", "relational", "Oracle" ],
"features" : {
"supported_encodings" : [ "any" ],
"single_message_transforms" : true,
"confluent_control_center_integration" : true,
"kafka_connect_api" : true
},
"documentation_url" : "https://docs.confluent.io/kafka-connect-oracle-cdc/current/",
"docker_image" : { },
"license" : [ {
"name" : "Confluent Software Evaluation License",
"url" : "https://www.confluent.io/software-evaluation-license"
} ],
"component_types" : [ "source" ],
"release_date" : "2021-02-04"
}
terraform/utils/instance.sh (2 changes: 1 addition & 1 deletion)
@@ -46,5 +46,5 @@ bash -c "$SCRIPT2"
chmod 666 /var/run/docker.sock
docker login -u ${docker_login} -p ${docker_password}
docker-compose up -d
-sleep 120
+sleep 280
docker-compose exec oracle /scripts/go_sqlplus.sh /scripts/oracle_setup_docker
