It's possible to set both the negative and the positive class weight in a binary classification scenario. However, when a multi-class classification model is trained with a one-vs-rest (OVR) solver, a weight can be set only for the positive (i.e. "One") class, while the weight for the negative (i.e. "Rest") class is always fixed at 1.
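For reference, here is a minimal sketch (not part of the issue) of how per-class weights are passed through liblinear's C API in the binary case, where both classes can be weighted. The data, weight values, and solver choice are illustrative, and depending on the liblinear version struct parameter may carry additional fields beyond the ones set here.

```cpp
// Minimal sketch of per-class weighting with liblinear's C API (illustrative data).
#include <cstdio>
#include <cstring>
#include "linear.h"   // liblinear's public header

int main()
{
    // Four 1-dimensional points, two per class; index -1 terminates a row.
    feature_node x1[] = { {1, -2.0}, {-1, 0.0} };
    feature_node x2[] = { {1, -1.0}, {-1, 0.0} };
    feature_node x3[] = { {1,  1.0}, {-1, 0.0} };
    feature_node x4[] = { {1,  2.0}, {-1, 0.0} };
    feature_node *x[] = { x1, x2, x3, x4 };
    double y[] = { -1, -1, +1, +1 };

    problem prob;
    prob.l = 4;      // number of training examples
    prob.n = 1;      // number of features
    prob.y = y;
    prob.x = x;
    prob.bias = -1;  // no bias term

    // In the binary case BOTH classes can be weighted: the effective
    // penalties become weight(+1)*C and weight(-1)*C.
    int    weight_label[] = { +1, -1 };
    double weight[]       = { 2.0, 0.5 };

    parameter param;
    std::memset(&param, 0, sizeof(param));  // zero optional fields (p, nu, init_sol, ...)
    param.solver_type  = L2R_L2LOSS_SVC;
    param.C            = 1.0;
    param.eps          = 0.01;
    param.nr_weight    = 2;
    param.weight_label = weight_label;
    param.weight       = weight;
    param.regularize_bias = 1;  // field exists in recent liblinear versions; drop on older releases

    const char *err = check_parameter(&prob, &param);
    if (err) { std::fprintf(stderr, "error: %s\n", err); return 1; }

    model *m = train(&prob, &param);
    std::printf("trained with %d classes\n", get_nr_class(m));
    free_and_destroy_model(&m);
    return 0;
}
```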
The difference can be seen in https://github.com/cjlin1/liblinear/blob/master/linear.cpp#L2552, where train_one receives both class weights, versus https://github.com/cjlin1/liblinear/blob/master/linear.cpp#L2578, where the plain param->C is passed for the negative side. That corresponds to always using a weight of 1.
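Paraphrasing the logic around those two lines (a simplified sketch, not the verbatim code; names follow linear.cpp):

```cpp
// Simplified paraphrase of the train() logic in linear.cpp (not verbatim).
// weighted_C[k] starts at param->C and is multiplied by the user-supplied
// weight for class k, if one was given.
if (nr_class == 2)
{
    // Binary case: both sides receive their weighted penalty.
    train_one(&sub_prob, param, model_->w, weighted_C[0], weighted_C[1]);
}
else
{
    for (int i = 0; i < nr_class; i++)
    {
        // One-vs-rest case: the "One" class gets its weighted penalty,
        // but the "Rest" side is always the unweighted param->C,
        // i.e. a fixed weight of 1.
        train_one(&sub_prob, param, w, weighted_C[i], param->C);
    }
}
```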
Because the negative side is pinned at weight 1, the weight vector cannot be normalized (rescaled) without changing the positive-to-negative penalty ratio, so unnormalized weights end up biasing the effective C.
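To make the bias concrete (an illustrative calculation, not taken from the issue): with C = 1 and per-class weights {2, 4, 6}, the one-vs-rest subproblem for the first class penalizes its positive examples with 2*C = 2 while every negative example gets the fixed C = 1. Rescaling the weights to {1, 2, 3} expresses the same relative class importance, yet that subproblem now uses penalties 1 and 1, so the trained model changes. If the negative side could be weighted as well, the whole weight vector could be normalized (e.g. to mean 1) without disturbing the positive-to-negative ratios.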