Dice score cannot be calculated for each class separately #1602
Comments
@asbjrnmunk would you be interested in working on this case and adding no reduction?
Isn't the Dice score equivalent to the F1 score (link)? Mathematically it works out the same; I'm not sure if there are implementation nuances. If both are the same, maybe just add an alias for people who prefer the name 'Dice' and a small line in the docs about their equivalence.
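For reference, the equivalence follows directly from the standard definitions (written out here, not quoted from the thread): with precision $P = \frac{TP}{TP+FP}$ and recall $R = \frac{TP}{TP+FN}$,

$$F_1 = \frac{2PR}{P+R} = \frac{2\,TP}{2\,TP + FP + FN} = \mathrm{Dice}.$$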
Have the same issue; I used MulticlassF1Score with average=None and it worked for me (since Dice is equivalent to the F1 score: https://torchmetrics.readthedocs.io/en/stable/classification/f1_score.html#multiclassf1score).
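(The commenter's snippet and its output were not preserved; below is a minimal sketch of the described workaround, with three classes assumed for illustration.)

```python
import torch
from torchmetrics.classification import MulticlassF1Score

preds = torch.randint(0, 3, (20,))
target = torch.randint(0, 3, (20,))

# average=None skips reduction and yields one F1 (= Dice) score per class.
metric = MulticlassF1Score(num_classes=3, average=None)
print(metric(preds, target))  # tensor of shape (3,)
```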
Hi, any updates on this? It would be handy for the Dice score to support `average=None`.
🐛 Bug
Dice score cannot be calculated without reduction; instead, a runtime error is raised.
To Reproduce
Minimal reproducing example:
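(The original snippet did not survive; below is a minimal sketch consistent with the report, assuming torchmetrics 0.11 and three classes.)

```python
import torch
from torchmetrics import Dice

preds = torch.randint(0, 3, (20,))
target = torch.randint(0, 3, (20,))

# Per the report, requesting per-class scores fails on 0.11.0/0.11.3
# instead of returning one Dice score per class.
dice = Dice(num_classes=3, average="none")
print(dice(preds, target))
```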
Running the snippet raises an error; the same error is encountered with `average=None`.

Expected behavior

The documentation states that `average='none'` or `average=None` should calculate the metric for each class separately, yet neither `'none'` nor `None` works.
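As a sketch, the documented behavior would look like this (expected output, not what 0.11 actually produces; `preds` and `target` as in the snippet above):

```python
dice = Dice(num_classes=3, average=None)
dice(preds, target)  # expected: tensor of shape (3,), one Dice score per class
```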
Environment

- TorchMetrics version (and how you installed TM, e.g. `conda`, `pip`, build from source): `pip`, versions 0.11.0 and 0.11.3

Additional context
Looking at the code, something seems fishy. Compare the following two snippets of `classification/dice.py`:

https://github.com/Lightning-AI/metrics/blob/825d17f32ee0b9a2a8024c89d4a09863d7eb45c3/src/torchmetrics/classification/dice.py#L149-L151
and
https://github.com/Lightning-AI/metrics/blob/21b23b6d472ec542c764a789af63bd054fbb3512/src/torchmetrics/classification/dice.py#L167-L168
It seems line 167 is wrong, since `average` is not modified between the two snippets.
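For readers without the permalinks at hand, here is a hedged sketch of the inconsistency being described, reconstructed from the issue text rather than copied from the torchmetrics source: an early check accepts `'none'`/`None` for `average`, while a later check validates `average` again against a smaller set, so values the first check admitted are rejected at runtime.

```python
# Illustrative sketch only -- not the actual torchmetrics source.
allowed_average = ("micro", "macro", "weighted", "samples", "none", None)

def dice_init(average="micro"):
    # First check (cf. dice.py L149-151): 'none' and None pass here.
    if average not in allowed_average:
        raise ValueError(f"The `average` has to be one of {allowed_average}, got {average}.")
    # Second check (cf. dice.py L167-168): `average` was never remapped in
    # between, so values admitted above can still fail here.
    if average not in ("micro", "macro", "samples"):
        raise ValueError(f"The `average` has to be one of {allowed_average}, got {average}.")

dice_init(average="none")  # passes the first check, raises on the second
```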