
Batch normalization layer #11

Open
weizequan opened this issue Nov 27, 2017 · 1 comment

@weizequan

In your *.prototxt files, all of the batch norm layers set:

```
batch_norm_param {
  use_global_stats: false
}
```

But the Caffe documentation (http://caffe.berkeleyvision.org/tutorial/layers/batchnorm.html) gives a detailed description: "By default, it is set to false when the network is in the training phase and true when the network is in the testing phase."

So in your code, use_global_stats is false for both the training and testing phases. Which is better?
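
In other words, if I read the documentation correctly, omitting the parameter lets Caffe choose it per phase. A minimal sketch of what that would look like (the layer and blob names here are hypothetical):

```
layer {
  name: "bn1"        # hypothetical layer name
  type: "BatchNorm"
  bottom: "conv1"    # hypothetical input blob
  top: "conv1"
  # No batch_norm_param: per the tutorial page, use_global_stats
  # defaults to false in the TRAIN phase and true in the TEST phase.
}
```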

@antingshen
Owner

The *_deploy.prototxt files are the ones used for inference/testing, and they set use_global_stats: true, so this matches the default. Feel free to try removing it and see whether it makes a difference during validation; that's a case I hadn't thought about. If you find a difference, let me know.
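
For comparison, a minimal sketch of the two files as described above, assuming hypothetical layer and blob names:

```
# Train prototxt: normalize with the current minibatch statistics.
layer {
  name: "bn1"        # hypothetical
  type: "BatchNorm"
  bottom: "conv1"    # hypothetical
  top: "conv1"
  batch_norm_param {
    use_global_stats: false
  }
}

# *_deploy.prototxt: normalize with the accumulated running mean/variance.
layer {
  name: "bn1"
  type: "BatchNorm"
  bottom: "conv1"
  top: "conv1"
  batch_norm_param {
    use_global_stats: true
  }
}
```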
