Your environment
Operating System: CentOS 7 (three-node Brooklin cluster running in VMs)
Brooklin version: built and deployed from the source code currently on master
Subject of the issue
Brooklin does not allow creating multiple datastreams replicating from different source Kafka clusters/topics to the same destination Kafka cluster and topic.
Let's say we have the following Kafka clusters: K1 and K2.
We wish to replicate messages from topic T1 in both K1 and K2 to topic T2 in K1.
We manage to create the first datastream (K2-T1 -> K1-T2), but the creation of the second one (K1-T1 -> K1-T2) fails with the error:
Cannot create a BYOT datastream where the destination is being used by other datastream(s)
Steps to reproduce
1. Create the first datastream (K2-T1 -> K1-T2). The datastream is created successfully.
2. Create the second datastream (K1-T1 -> K1-T2). We get the following error:
Cannot create a BYOT datastream where the destination is being used by other datastream(s)
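The two steps above boil down to submitting two datastream definitions whose destinations are byte-for-byte identical. A minimal sketch of the payloads (the field names approximate Brooklin's datastream JSON shape, and the broker addresses are made-up placeholders, not our actual hosts):

```python
def make_datastream(name, source, destination):
    """Build a minimal mirroring-datastream payload (illustrative only)."""
    return {
        "name": name,
        "connectorName": "kafkaMirroringConnector",
        "source": {"connectionString": source},
        "destination": {"connectionString": destination},
    }

# First datastream: K2-T1 -> K1-T2 (created successfully).
ds1 = make_datastream(
    "mirror-k2t1-to-k1t2",
    "kafka://k2-broker1:9092,k2-broker2:9092/T1",
    "kafka://k1-broker1:9092,k1-broker2:9092/T2",
)

# Second datastream: K1-T1 -> K1-T2 (rejected with the BYOT error).
ds2 = make_datastream(
    "mirror-k1t1-to-k1t2",
    "kafka://k1-broker1:9092,k1-broker2:9092/T1",
    "kafka://k1-broker1:9092,k1-broker2:9092/T2",
)

# The sources differ, but the destination connection strings are
# identical, which is what trips the duplicate-destination check.
print(ds1["destination"] == ds2["destination"])  # True
print(ds1["source"] == ds2["source"])            # False
```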
Expected behaviour
We expected that it would be possible to replicate from multiple source Kafka clusters/topics to the same Kafka destination cluster and topic.
Actual behaviour
Creating the second datastream fails when another datastream with the same destination already exists. However, if we swap the broker order in the second datastream's destination connection string, the creation succeeds. Since reordering the brokers does not change the actual destination cluster or topic, this suggests the duplicate-destination check compares connection strings literally rather than comparing the broker sets.
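The broker-order workaround can be illustrated with a small sketch. The `kafka://host1:port,host2:port/topic` connection-string format matches the datastreams above; the normalization function itself is our illustration of how a broker-order-insensitive comparison could look, not Brooklin code:

```python
def normalize(connection_string):
    """Canonicalize a kafka://host1:port,host2:port/topic string by
    sorting the broker list, so that broker order does not matter."""
    scheme, rest = connection_string.split("://", 1)
    brokers, topic = rest.rsplit("/", 1)
    sorted_brokers = ",".join(sorted(brokers.split(",")))
    return f"{scheme}://{sorted_brokers}/{topic}"

dest_a = "kafka://k1-broker1:9092,k1-broker2:9092/T2"
dest_b = "kafka://k1-broker2:9092,k1-broker1:9092/T2"  # same brokers, swapped order

# A literal string comparison treats these as different destinations,
# which is why swapping the broker order bypasses the BYOT check.
print(dest_a == dest_b)                        # False
# After normalization they compare equal, as they refer to the same
# cluster and topic.
print(normalize(dest_a) == normalize(dest_b))  # True
```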