
Eta Encoder #7

Open
kmalhotra30 opened this issue Aug 12, 2024 · 2 comments

Comments

@kmalhotra30 commented Aug 12, 2024

Hi,

I see that the eta encoder is not being trained. Is this correct, and if so, is it intentional? Thanks.

@lianruizuo (Owner)
In this GitHub repo, we use a pretrained eta encoder with frozen weights. We found that a pretrained eta encoder helps the attention mechanism better capture subtle differences between source contrasts. The pretrained eta-encoder weights are included in the HACA3 weights.
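
In code, the freezing would look roughly like the sketch below. This is a minimal PyTorch illustration, not the actual HACA3 code; `EtaEncoder`, its architecture, and the checkpoint layout are all assumptions made for the example:

```python
import torch
import torch.nn as nn

class EtaEncoder(nn.Module):
    """Toy stand-in for the eta encoder: conv layers -> one embedding per image."""
    def __init__(self, in_ch=1, dim=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, dim, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )

    def forward(self, x):
        return self.net(x)

eta_encoder = EtaEncoder()
# In practice, the pretrained weights would be loaded here, e.g.
# eta_encoder.load_state_dict(ckpt["eta_encoder"])  # hypothetical checkpoint key

# Freeze: no gradients flow into the eta encoder during HACA3 training.
for p in eta_encoder.parameters():
    p.requires_grad = False
eta_encoder.eval()  # also fixes dropout/batch-norm behavior, if any

x = torch.randn(4, 1, 64, 64)
with torch.no_grad():
    eta = eta_encoder(x)  # (4, 32) eta embeddings
```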

@kmalhotra30 (Author)

Hi, thanks for your response.

  • I see that the pretrained weights are indeed provided.
  • However, pretrained weights cannot always be reused, especially when training HACA3 on our own dataset.
  • There is code provided for training every other module (beta encoder, theta encoder, decoder, patchifier, attention module), but not the eta encoder. The contrastive loss for the eta encoder also seems to be missing from the implementation (please correct me if I am wrong); a rough sketch of such a loss follows this list.
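
For concreteness, here is a minimal sketch of the kind of contrastive (NT-Xent / InfoNCE) loss I would expect for training the eta encoder, assuming positives are two views of the same image and all other images in the batch are negatives. This is my guess at the setup, not the repo's implementation:

```python
import torch
import torch.nn.functional as F

def nt_xent_loss(z1, z2, temperature=0.1):
    """NT-Xent contrastive loss over a batch of embedding pairs.

    z1, z2: (B, D) embeddings of two views of the same images
    (e.g. two augmentations of the same contrast). Row i of z1 and
    row i of z2 are positives; every other row is a negative.
    """
    z1 = F.normalize(z1, dim=1)
    z2 = F.normalize(z2, dim=1)
    z = torch.cat([z1, z2], dim=0)        # (2B, D)
    sim = z @ z.t() / temperature         # (2B, 2B) cosine similarities
    sim.fill_diagonal_(float("-inf"))     # exclude self-similarity
    B = z1.size(0)
    # The positive for row i is row i+B, and vice versa.
    targets = torch.cat([torch.arange(B) + B, torch.arange(B)])
    return F.cross_entropy(sim, targets.to(z.device))

# Usage with an (unfrozen) eta encoder and two augmented views per image:
# loss = nt_xent_loss(eta_encoder(view1), eta_encoder(view2))
# loss.backward()  # gradients now flow into the eta encoder's parameters
```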

It would be great if this omission of the eta-encoder training code could be explicitly mentioned in the readme.md. Furthermore, would it be possible for you to provide code with the training of the eta encoder included?

Once again, thanks for your response. Looking forward to hearing back.
