Event Discovery Agent with AsyncAPI output #147
Closed
gregmeldrum started this conversation in zArchived - AsyncAPI Hack 2021 - Submissions
Replies: 1 comment
-
Hey folks, in case you want to donate this project to the AsyncAPI org, please follow the same approach as Souvik -> #174. So basically, make an official offer and then we can ask the AsyncAPI TSC to vote on it.
-
Event Discovery Agent Introduction
Have you ever wondered which events are flowing through your message broker? If so, the Event Discovery Agent is here to help! Simply point the agent at your broker, run a scan, and retrieve an AsyncAPI document describing the events.
What brokers are supported?
The current version has well-tested plugins that support Apache Kafka and Solace PubSub+ brokers, plus a plugin to scan a Confluent Schema Registry. There are also untested plugins that support NATS, RabbitMQ and HiveMQ brokers.
What AsyncAPI spec version is supported?
The Agent supports a subset of the 2.2.0 specification.
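To give a feel for what a scan produces, here is an illustrative sketch of the kind of AsyncAPI 2.2.0 document the agent emits. The channel, message, and field names below are invented for illustration, not actual agent output; the agent's real output depends on what it discovers on your broker.

```python
import json

# Hypothetical example of the shape of an AsyncAPI 2.2.0 document a scan
# might produce. The channel/message/field names are illustrative only.
asyncapi_doc = {
    "asyncapi": "2.2.0",
    "info": {"title": "Kafka Cluster Scan", "version": "1.0.0"},
    "channels": {
        "order-created": {
            "subscribe": {
                "message": {
                    "name": "OrderCreated",
                    # Payload schema reverse-engineered from a sample message
                    "payload": {
                        "type": "object",
                        "properties": {
                            "orderId": {"type": "string"},
                            "total": {"type": "number"},
                        },
                    },
                }
            }
        }
    },
}

print(json.dumps(asyncapi_doc, indent=2))
```

AsyncAPI documents can be serialized as JSON or YAML; JSON is shown here for brevity.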
Wait, this is free?
Yep, free, open-source, Apache 2.0.
Where can I get the Event Discovery Agent?
https://github.com/SolaceLabs/event-discovery-agent
What about broker XYZ?
Glad you asked! The Event Discovery Agent is built with an extensible plugin architecture. The plugin developer's guide will walk you through creating a plugin for your XYZ broker, and the UI developer's guide will walk you through creating a UI for your custom broker plugin.
I'm sold! How do I use it?
The Agent exposes a REST API that is used to initiate an asynchronous broker scan and then retrieve the AsyncAPI results. For more information, see the REST API documentation. Here is an example of how to initiate an Apache Kafka discovery:
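A minimal sketch of kicking off a Kafka scan over HTTP is shown below. The endpoint path, request fields, and scan parameters here are assumptions for illustration; check the agent's REST API documentation for the actual contract.

```python
import json
import urllib.request

# Sketch of initiating a Kafka scan via the agent's REST API.
# The endpoint path and request-body fields are ASSUMPTIONS for
# illustration, not the agent's documented contract.
scan_request = {
    "brokerIdentity": {
        "brokerType": "KAFKA",
        "host": "localhost:9092",  # assumed Kafka bootstrap address
    },
    "scanDuration": 30,  # hypothetical scan length in seconds
}

req = urllib.request.Request(
    # Assumed path; the agent's UI runs on port 8120 per the docs below.
    url="http://localhost:8120/api/v0/event-discovery-agent/local/app/operation",
    data=json.dumps(scan_request).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

# Actually sending the request requires a running agent, so it is left
# to the caller; the scan is asynchronous, so the response would carry
# an identifier used to poll for the AsyncAPI results:
# with urllib.request.urlopen(req) as resp:
#     job = json.load(resp)
print(req.full_url)
```

Because the scan is asynchronous, a second request is needed afterwards to retrieve the generated AsyncAPI document.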
Sorry, I'm not a techie, REST APIs are too hard for me. Is there another way to interact with the Agent?
No worries, there is a built-in user interface for the plugins (Apache Kafka, Confluent Schema Registry + Kafka, Solace PubSub+, NATS, HiveMQ and RabbitMQ). You can access it at http://localhost:8120 after starting the agent. See the UI documentation for additional details.
So how does the event collection actually work?
This depends on the plugin. For Apache Kafka, a message listener pulls the last record from each topic and reverse-engineers a JsonSchema representation of the payload (this only works for JSON payloads). For the Confluent Schema Registry plugin, the agent pulls the latest schema from each subject and, depending on the SubjectNameStrategy, determines which topic the schema belongs to. For all other plugins (NATS, RabbitMQ, HiveMQ and Solace PubSub+), the agent creates a passive message listener that receives messages matching the supplied subscriptions. The last message on each topic is used to create a JsonSchema representation of the payload (again, JSON payloads only).
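The "reverse-engineer a JsonSchema from a sample payload" step above can be sketched with a simple recursive walk. This is an illustrative toy, not the agent's actual code, and it only considers a single sample message, just as the description says:

```python
import json

def infer_json_schema(value):
    """Derive a minimal JSON Schema from one decoded JSON value.

    Illustrative sketch of schema inference from the last message on a
    topic; the real agent's inference is richer than this.
    """
    if isinstance(value, dict):
        return {
            "type": "object",
            "properties": {k: infer_json_schema(v) for k, v in value.items()},
        }
    if isinstance(value, list):
        # Use the first element as the representative item type, if any.
        return {"type": "array", "items": infer_json_schema(value[0]) if value else {}}
    # bool must be checked before int (bool is a subclass of int in Python)
    if isinstance(value, bool):
        return {"type": "boolean"}
    if isinstance(value, int):
        return {"type": "integer"}
    if isinstance(value, float):
        return {"type": "number"}
    if value is None:
        return {"type": "null"}
    return {"type": "string"}

# A pretend "last message" pulled from a topic:
payload = json.loads('{"orderId": "o-123", "total": 99.5, "items": [{"sku": "A1"}]}')
schema = infer_json_schema(payload)
print(json.dumps(schema, indent=2))
```

A single-sample approach like this is cheap but can miss optional fields or fields whose types vary across messages, which is one reason schema-registry-backed discovery is more reliable when a registry is available.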
This sounds too good to be true, what are the limitations?
The Event Discovery Agent does have limitations. I prefer to call them opportunities since these are areas that we can improve through community contributions.
What is the future of the agent?
The sky is the limit. Many of the limitations could be addressed by new features.
Contributions are welcome!