Generate a quality control report when exporting the classifier #14

Open
MaksHess opened this issue Mar 2, 2022 · 0 comments


MaksHess commented Mar 2, 2022

Following up on a recent discussion with @jluethi, it would be nice to provide the user with a report assessing classifier performance and potential problems with the annotated data. Here's a collection of things that could be useful:

Scoring

  • Per-class F1 score (for more than 2 classes; could even be added to the status bar).
  • Confusion matrix (a sketch computing both metrics is shown after this list).
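
A minimal sketch of how the scoring part could be assembled, assuming the exported classifier keeps held-out true labels and predictions as plain arrays and that scikit-learn metrics are acceptable here (the function and variable names below are hypothetical, not part of the plugin's API):

```python
# Hedged sketch only: `y_true`, `y_pred`, and `class_names` are placeholder
# inputs, assuming held-out annotations and predictions are available.
from sklearn.metrics import classification_report, confusion_matrix, f1_score


def scoring_report(y_true, y_pred, class_names):
    """Collect per-class F1 scores and a confusion matrix for the QC report."""
    labels = list(range(len(class_names)))
    per_class_f1 = f1_score(y_true, y_pred, labels=labels, average=None)
    cm = confusion_matrix(y_true, y_pred, labels=labels)
    return {
        "per_class_f1": dict(zip(class_names, per_class_f1)),
        "confusion_matrix": cm,
        "summary": classification_report(
            y_true, y_pred, labels=labels, target_names=class_names
        ),
    }
```

Returning both the per-class scores and the raw confusion matrix keeps the report readable in the status bar while still allowing a plotted matrix in the exported report.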

Annotations

  • Feedback regarding class imbalance.
  • Prediction uncertainty (it might be interesting to show this interactively, to guide the user toward annotating objects near the decision boundary).
  • 2D embedding of the annotated points colored by their predicted class, with the annotations superimposed. This could give quick visual feedback on whether only "clear cases" were annotated (a sketch covering these three points follows this list).
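
A rough sketch of the annotation-side checks, assuming a scikit-learn-style classifier with `predict_proba` and a feature matrix of the annotated objects; `clf`, `features`, and `annotations` are placeholder names, and PCA stands in for whatever 2D embedding is preferred (UMAP would slot in the same way):

```python
# Hedged sketch only: none of these names are part of the plugin's API.
from collections import Counter

from sklearn.decomposition import PCA


def annotation_report(clf, features, annotations):
    # Class imbalance: how many annotations each class received.
    class_counts = Counter(annotations)

    # Prediction uncertainty: 1 - max class probability; high values flag
    # objects near the decision boundary that would benefit from annotation.
    proba = clf.predict_proba(features)
    uncertainty = 1.0 - proba.max(axis=1)

    # 2D embedding of the annotated points, colored by predicted class, to
    # check whether only "clear cases" were annotated.
    embedding = PCA(n_components=2).fit_transform(features)
    predicted_class = proba.argmax(axis=1)

    return class_counts, uncertainty, embedding, predicted_class
```

The embedding and `uncertainty` could then be plotted directly (scatter colored by `predicted_class`, with annotated points outlined) and saved alongside the classifier export.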