
Confusion matrix interface suggestions #445

Open · neubig opened this issue Oct 22, 2022 · 2 comments

neubig (Contributor) commented Oct 22, 2022

The new "confusion matrix" feature is a great start, but I felt the interface is a bit hard to understand:

[Screenshot: confusion matrix visualization, Oct 22, 2022]

  1. "F1 by confusion matrix" is not correct. It should just be "confusion matrix".
  2. And also, usually a "confusion matrix" will have, for each true tag true_tag and predicted tag predicted_tag: cooccurrence(true_tag, predicted_tag)/count(true_tag), but right now it's just cooccurrence(true_tag, predicted_tag)/total_count. So we should either change the naming or the calculation.
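
To make the distinction concrete, here is a minimal sketch with made-up counts (assuming numpy; the tag counts and variable names are illustrative, not taken from the actual feature):

```python
import numpy as np

# Hypothetical counts: rows = true tag, columns = predicted tag.
# Entry [i, j] is cooccurrence(true_tag_i, predicted_tag_j).
counts = np.array([
    [8, 1, 1],
    [2, 6, 2],
    [0, 1, 9],
])

# Current behavior: every cell is divided by the grand total.
by_total = counts / counts.sum()

# Conventional confusion matrix: each cell is divided by its row's
# true-tag count, so row i shows how predictions for true tag i are
# distributed (each row sums to 1).
by_true_tag = counts / counts.sum(axis=1, keepdims=True)
```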

cc @PaulCCCCCCH

PaulCCCCCCH (Collaborator) commented

Thanks for creating the issue.

  1. Currently we display "<metric> by <feature description>" as the title. I guess we can simply remove the metric (a possible fix is sketched below).
  2. As for the calculation, I can fix it in another PR.
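
A possible shape for dropping the metric from the title (a hypothetical sketch; the function name and parameters are illustrative, not the project's actual code):

```python
def plot_title(metric: str, feature_description: str,
               is_confusion_matrix: bool) -> str:
    """Build the analysis plot title."""
    if is_confusion_matrix:
        # Per point 1 above, just the feature description,
        # e.g. "confusion matrix".
        return feature_description
    # Ordinary case keeps the existing "<metric> by <feature description>".
    return f"{metric} by {feature_description}"
```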

By the way, the axes seem weird. Are these the real feature values for the task, or is there a bug?

neubig (Contributor, Author) commented Oct 22, 2022

Thanks!

These are real values for a task. It's a very small example dataset though, so that's why the confusion matrix is sparse.
