a bug in EuclideanCodebook #18

Open
krgy12138 opened this issue Oct 5, 2024 · 0 comments

```python
if self.training:
    # We do the expiry of code at that point as buffers are in sync
    # and all the workers will take the same decision.
    self.expire_codes_(x)
    ema_inplace(self.cluster_size, embed_onehot.sum(0), self.decay)
    embed_sum = x.t() @ embed_onehot
    ema_inplace(self.embed_avg, embed_sum.t(), self.decay)
    cluster_size = (
        laplace_smoothing(self.cluster_size, self.codebook_size, self.epsilon)
        * self.cluster_size.sum()
    )
    embed_normalized = self.embed_avg / cluster_size.unsqueeze(1)
    self.embed.data.copy_(embed_normalized)
```
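For context, the two helpers used above look roughly like this (a sketch of the usual definitions in this kind of EMA vector-quantization code; the names and signatures come from the snippet, the bodies are my reconstruction):

```python
import torch

def ema_inplace(moving_avg, new, decay):
    # Exponential moving average, updated in place:
    # moving_avg <- decay * moving_avg + (1 - decay) * new
    moving_avg.data.mul_(decay).add_(new, alpha=(1 - decay))

def laplace_smoothing(x, n_categories, epsilon=1e-5):
    # Additive (Laplace) smoothing over the category counts;
    # note the result is then rescaled by x.sum() in the caller.
    return (x + epsilon) / (x.sum() + n_categories * epsilon)
```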
The computed `cluster_size` can be zero: if `self.cluster_size.sum()` is 0 (no code has received any mass yet), the smoothed and rescaled `cluster_size` is zero for every code, and since `self.embed_avg` is also zero in that case, the division `self.embed_avg / cluster_size.unsqueeze(1)` evaluates 0/0 and fills the codebook with NaNs. To guard against this division by zero, a line should be added before the division:

```python
cluster_size = torch.clamp(cluster_size, min=self.epsilon)
```
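To see the failure mode concretely, here is a minimal standalone sketch (with hypothetical sizes; `cluster_size_buf` stands in for `self.cluster_size`) of what happens when the EMA cluster sizes are all zero, e.g. before any codes have been assigned:

```python
import torch

codebook_size, dim, epsilon = 4, 2, 1e-5

# Hypothetical state: no code has ever been assigned, so the EMA
# cluster-size buffer is all zeros and its sum is zero.
cluster_size_buf = torch.zeros(codebook_size)
embed_avg = torch.zeros(codebook_size, dim)

smoothed = (cluster_size_buf + epsilon) / (cluster_size_buf.sum() + codebook_size * epsilon)
cluster_size = smoothed * cluster_size_buf.sum()  # == 0 for every code

print(embed_avg / cluster_size.unsqueeze(1))  # 0 / 0 -> tensor of NaNs

cluster_size = torch.clamp(cluster_size, min=epsilon)  # proposed guard
print(embed_avg / cluster_size.unsqueeze(1))  # finite (all zeros)
```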
