Hi, @may0324
Thanks for your cool work!
After cloning this code, we have been trying to replicate Han's work on pruning conv-nets, especially large models like CaffeNet, a variant of AlexNet.
While pruning fc6, the first fully connected layer in the network, we observed that this layer can only be pruned to a sparsity of at most 0.4444444: setting the parameter `sparse_ratio` to any value larger than 0.45 still yields an actual sparsity of 0.4444444. This seems rather odd to me. Have you ever encountered this problem, or could you share your advice?
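For reference, this is roughly how we check the "actual sparsity" mentioned above. It is only a minimal pycaffe sketch with hypothetical file names, assuming sparsity here means the fraction of exactly-zero entries in the fc6 weight blob:

```python
import numpy as np
import caffe

# Minimal sketch: load the pruned model (file names are placeholders)
# and measure the fraction of zeroed weights in fc6.
net = caffe.Net('deploy.prototxt', 'pruned_caffenet.caffemodel', caffe.TEST)

weights = net.params['fc6'][0].data     # fc6 weight blob
sparsity = np.mean(weights == 0)        # fraction of exactly-zero weights
print('fc6 sparsity: %.7f' % sparsity)  # stays at 0.4444444 for sparse_ratio > 0.45
```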
Thanks again!
Cheers!