Your work is excellent! However, when I tried fine-tuning on my downstream regression task, I found that the Mixup function takes the parameter `num_classes=args.nb_classes`. Because of the special nature of my task, my output target has only one class, so I see two possible solutions:

1. Naively set `num_classes` to 1.
2. Disable Mixup. Also, if I keep Mixup, I would need to find a proper loss function for my regression task (see the sketch below).

I'm not sure which solution to use, or whether there is a better one. I'm looking forward to your answer!
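For reference, here is a rough sketch of what I imagine Mixup adapted to a single continuous target could look like; the `mixup_regression` helper, the `alpha` value, and the MSE loss are my own assumptions, not part of your code:

```python
import torch
import torch.nn.functional as F

def mixup_regression(x, y, alpha=0.8):
    """Mix inputs and continuous targets with the same lambda drawn from Beta(alpha, alpha).

    Unlike timm's Mixup, no one-hot / label-smoothed targets are built,
    so num_classes is not needed for a regression target.
    """
    lam = float(torch.distributions.Beta(alpha, alpha).sample())
    perm = torch.randperm(x.size(0), device=x.device)
    x_mixed = lam * x + (1.0 - lam) * x[perm]
    y_mixed = lam * y + (1.0 - lam) * y[perm]
    return x_mixed, y_mixed

# Hypothetical use inside the fine-tuning loop:
#   samples, targets = mixup_regression(samples, targets, alpha=args.mixup)
#   loss = F.mse_loss(model(samples), targets)
```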