No batchnorm layer in your source? #19

Open
SarthakYadav opened this issue Jul 20, 2017 · 3 comments

Comments

@SarthakYadav

I am trying to use ResNet-50, but your src has no files corresponding to batch_norm layers, so the compiled binary has no concept of "batch_norm". How can one build it with batch_norm support? Merging in the latest Caffe code for that layer throws a lot of errors.

@kevinlin311tw
Owner

kevinlin311tw commented Jul 25, 2017 via email

@SarthakYadav
Author

I actually added the source code for the loss-function implementations, but it wouldn't compile against newer Caffe versions.

@kevinlin311tw
Owner

kevinlin311tw commented Aug 31, 2017

Sorry for the late reply. I am too busy with current projects.

If you want to generate binary codes quickly and avoid additional implementations, you can simply train the model with a latent layer + sigmoid activation function + softmax loss (this is exactly our workshop paper). It already learns high-quality and fairly evenly distributed binary codes.
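To illustrate the inference side of that recipe: after training, the binary code for an image is obtained by passing the latent layer's sigmoid activations through a 0.5 threshold. Below is a minimal NumPy sketch of that binarization step (the function names and the sample activations are hypothetical, not from this repo):

```python
import numpy as np

def sigmoid(x):
    # Standard logistic function, as used by the latent layer's activation
    return 1.0 / (1.0 + np.exp(-x))

def binary_codes(latent_preactivations, threshold=0.5):
    # Binarize the sigmoid outputs: bit = 1 where activation >= threshold
    return (sigmoid(latent_preactivations) >= threshold).astype(np.uint8)

# Hypothetical latent-layer pre-activations for two images (8-bit codes)
feats = np.array([[ 2.0, -1.5, 0.0,  3.2, -0.1,  1.1, -2.2,  0.4],
                  [-0.5,  0.7, 1.9, -3.0,  0.2, -0.9,  2.5, -0.3]])
codes = binary_codes(feats)
# codes -> [[1, 0, 1, 1, 0, 1, 0, 1],
#           [0, 1, 1, 0, 1, 0, 1, 0]]
```

Since sigmoid(x) >= 0.5 exactly when x >= 0, thresholding at 0.5 is equivalent to taking the sign of the pre-activation, which is why no extra loss-function implementation is needed to get usable codes.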
