Administration GUI for collector-http crawler config #183
Are there any plans for creating an administration GUI for crawler configuration?

Comments
Many plans, little time! ;-) Seriously, our internal wish list for our open-source offering is quite big, but a crawler GUI is currently low on that list. I am marking this as a feature request.
Yes, this is not a core feature. Maybe there are other open-source projects that could provide a GUI given a .xsd or .dtd file. I found a project on GitHub that may solve part of the problem: https://github.com/davidmoten/xsd-forms. Maybe there are other projects as well that could be of interest here.
You can give it a try and report the kind of success you get, but the reason this cannot be an all-purpose solution is that the XML definition for the collector is not static. We cannot release a one-size-fits-all XSD or DTD: people can add their own classes with their own custom configurable XML, and we want to keep that flexibility. There is also support for Velocity directives, which would not work well with that approach in some cases (it would break XML parsers if the configuration has not been interpreted by Velocity first). We could look into changing how configuration is implemented, or maybe have each configurable class provide its own DTD or something like that, but that is not planned. We want to keep adding your own classes as simple as possible, without many requirements. One day maybe... :-) But if you find anything that can help in the meantime, please share.
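To illustrate (with purely hypothetical names, not actual project code): each pluggable class reads whatever XML it chooses to accept, so the set of valid child elements is known only to that class and can never be captured in one static schema. A rough sketch:

```java
// Hypothetical custom filter a user might plug into a crawler configuration,
// e.g. referenced as <filter class="com.example.RegexUrlFilter">. The child
// elements (<pattern>, <caseSensitive>) are valid only because this class
// decides to read them, which is why no one-size-fits-all XSD/DTD can
// describe every possible configuration.
import java.io.Reader;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.NodeList;
import org.xml.sax.InputSource;

public class RegexUrlFilter {

    private String pattern = ".*";
    private boolean caseSensitive = false;

    // The framework would hand this method the XML of the <filter> element
    // itself; only this class knows what is allowed inside it.
    public void loadFromXML(Reader in) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new InputSource(in));
        NodeList p = doc.getElementsByTagName("pattern");
        if (p.getLength() > 0) {
            pattern = p.item(0).getTextContent();
        }
        NodeList c = doc.getElementsByTagName("caseSensitive");
        if (c.getLength() > 0) {
            caseSensitive = Boolean.parseBoolean(c.item(0).getTextContent());
        }
    }
}
```

And on top of that, a configuration file may wrap Velocity directives around or across elements, so it is not even guaranteed to be well-formed XML until Velocity has processed it.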
Creating a GUI that solves the entire configuration challenge in a usable way is of course a really big task. But providing a usable GUI that solves some part of the configuration challenge may be possible by using other projects. I may take a deeper look at it.
I can envision an application that solves this by having 2-3 tables. This is all linked with JEF Monitor so that all crawls are integrated.
So, I've thought more about this, and I'm thinking that a GUI is not the right way to go, at least not initially. It would be better to do this as a microservice embedding collector-http. APIs would allow manipulating configurations, running crawls based on those configurations, and getting status. Then, this could be integrated into the admin section of a GUI that uses the crawler. It also allows scaling, since multiple collectors can be distributed to multiple hosts behind a front end.
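A very rough sketch of what such a service could look like, using only the JDK's built-in HTTP server; the endpoint paths and the idea of launching the embedded collector from the /crawls handler are placeholders, not an actual collector-http API:

```java
// Minimal skeleton of a crawler microservice: a front-end (or JEF Monitor)
// talks to these endpoints, and several instances of this service could be
// spread across hosts for scaling.
import com.sun.net.httpserver.HttpExchange;
import com.sun.net.httpserver.HttpServer;
import java.io.IOException;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;

public class CrawlerService {

    public static void main(String[] args) throws IOException {
        HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);

        // Create, update, list, or delete named crawler configurations.
        server.createContext("/configs", ex -> respond(ex, "manage configurations"));

        // Start or stop a crawl for a given configuration; this handler is where
        // the embedded collector would actually be launched.
        server.createContext("/crawls", ex -> respond(ex, "start or stop crawls"));

        // Report progress so a GUI can poll it and aggregate across instances.
        server.createContext("/status", ex -> respond(ex, "report crawl status"));

        server.start();
    }

    private static void respond(HttpExchange ex, String body) throws IOException {
        byte[] bytes = body.getBytes(StandardCharsets.UTF_8);
        ex.sendResponseHeaders(200, bytes.length);
        try (OutputStream os = ex.getResponseBody()) {
            os.write(bytes);
        }
    }
}
```

Whether the collector runs in-process or is spawned per crawl is a design choice the sketch leaves open.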