Sorry for the super late reply. We want the current word to have distance 0 from itself; the words to its left have distances -1, -2, -3, ... (from the nearest left neighbor to the furthest), and the words to its right have distances 1, 2, 3, .... However, as you already know, nn.Embedding() does not support negative indexes, so the negative indexes on the left must be converted into positive ones. We do this conversion by adding self.origin_shift to every distance. The resulting indexes then look like [..., self.origin_shift-3, self.origin_shift-2, self.origin_shift-1, self.origin_shift, self.origin_shift+1, self.origin_shift+2, ...].
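To make the shift concrete, here is a small illustrative sketch (not the repository's actual code) of how signed relative distances become valid non-negative embedding indexes; the value 512 for num_embeddings and the sequence length are made up for the example:

```python
# Hypothetical table size; origin_shift is computed as in the question.
num_embeddings = 512
origin_shift = num_embeddings // 2 + 1  # 257

seq_len = 5
# Relative distance of every position j to a "current" position i:
# negative to the left, 0 at i itself, positive to the right.
rel = [[j - i for j in range(seq_len)] for i in range(seq_len)]

# Shift every distance so all indexes are non-negative and
# therefore usable for an nn.Embedding-style table lookup.
idx = [[d + origin_shift for d in row] for row in rel]

print(rel[2])  # [-2, -1, 0, 1, 2]
print(idx[2])  # [255, 256, 257, 258, 259]
```

The middle row shows the idea: the current word maps to origin_shift itself, left neighbors fall just below it, and right neighbors just above, with all indexes staying inside [0, num_embeddings).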
Hi,
Could you please clarify the significance of 'num_embeddings' and 'origin_shift' when computing RelativeSinusoidalPositionalEmbedding?
self.origin_shift = num_embeddings//2 + 1
Thanks,
Neena