Caves were found, dragons discovered, and yet, light prevailed. I am happy to announce the initial release of this connector 🎉 !
This connector uses the Snowflake JDBC driver to write Avro data from Kafka directly into flattened tables in the Snowflake ❄️ data warehouse.
Features include:
- Automatic table creation
- Automatic schema evolution
- Inclusion of Kafka metadata
As a general note, there is a known performance issue with complex Avro types: arrays, maps/dictionaries, and nested records. The connector still works with them, but it is best suited to lower-volume topics. Engineers may need to increase the poll timeouts on the connector's underlying consumer, or decrease the batch size between commits.
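For example, on a Kafka Connect worker configured with `connector.client.config.override.policy=All`, both tunings can be applied per connector via the standard `consumer.override.*` prefix. This is a sketch, not a recommendation; the values are illustrative and the right numbers depend on your topic's volume:

```json
{
  "consumer.override.max.poll.interval.ms": "600000",
  "consumer.override.max.poll.records": "200"
}
```

Merging this fragment into the connector's config raises the allowed interval between polls and caps how many records are fetched per poll, which effectively shrinks each batch.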
At this time, this connector works best with primitives.
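To illustrate what "primitives" means here, a flat Avro record like the following (a hypothetical schema, not one from the connector's docs) maps cleanly to Snowflake columns, whereas adding an array or map field, e.g. `{"name": "tags", "type": {"type": "array", "items": "string"}}`, would hit the slower path described above:

```json
{
  "type": "record",
  "name": "PageView",
  "fields": [
    {"name": "user_id", "type": "long"},
    {"name": "url", "type": "string"},
    {"name": "viewed_at", "type": "long"}
  ]
}
```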