Hi Assistants,
Could we change layers, e.g. swap ReLU for PReLU, or change the shortcut from a 1x1 to a 3x3 convolution, etc.?
Or may we only modify the hyper-parameters, such as the number of iterations, learning rate, etc.?
Does the exercise score depend only on the performance of ResNet-110?
Are ResNet-20 and ResNet-56 not considered?
Thank you so much!
You can change the ReLU layer to another activation function, but it may not improve your network.
You can also modify the hyper-parameters, though that likewise may not help.
Yes, the network performance score for Homework 1 depends only on the result of ResNet-110.
However, you still need to report your results for ResNet-20 and ResNet-56.
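To make the ReLU vs. PReLU swap concrete: PReLU replaces ReLU's hard zero on negative inputs with a small slope `a` that is learned during training, so gradients still flow for negative activations. Below is a minimal NumPy sketch of the two functions (not the course code; the slope value `0.25` is only PyTorch's default initialization, used here as an assumption):

```python
import numpy as np

def relu(x):
    # ReLU: zeroes out all negative activations
    return np.maximum(0.0, x)

def prelu(x, a=0.25):
    # PReLU: keeps a slope `a` on the negative side instead of zero;
    # in a real network, `a` is a learnable parameter
    return np.where(x > 0, x, a * x)

x = np.array([-2.0, -0.5, 0.0, 1.0, 3.0])
print(relu(x))   # negative inputs mapped to 0
print(prelu(x))  # negative inputs scaled by a
```

In frameworks such as PyTorch this is a one-line change (e.g. `nn.ReLU()` to `nn.PReLU()`), but as noted above it is not guaranteed to improve accuracy.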