word_language_model: Fix Transformer init_weights
Model was not getting initialized properly since it was using the
decoder object instead of the decoder weight to initialize zeros.

Signed-off-by: Eli Uriegas <[email protected]>
seemethere committed Jun 15, 2020
1 parent edfa9f2 commit 13acec6
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion — word_language_model/model.py

```diff
@@ -133,7 +133,7 @@ def _generate_square_subsequent_mask(self, sz):
     def init_weights(self):
         initrange = 0.1
         nn.init.uniform_(self.encoder.weight, -initrange, initrange)
-        nn.init.zeros_(self.decoder)
+        nn.init.zeros_(self.decoder.weight)
         nn.init.uniform_(self.decoder.weight, -initrange, initrange)

     def forward(self, src, has_mask=True):
```
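The bug this commit fixes is passing the `nn.Linear` module object itself to `nn.init.zeros_`, which expects a `Tensor` (the module's `.weight` or `.bias` attribute). A minimal standalone sketch of the corrected initialization, using small hypothetical layer sizes in place of the model's real `ntoken`/`ninp` configuration:

```python
import torch
import torch.nn as nn

# Hypothetical sizes for illustration; the real model derives these
# from its vocabulary size and embedding dimension.
encoder = nn.Embedding(10, 4)
decoder = nn.Linear(4, 10)

initrange = 0.1
nn.init.uniform_(encoder.weight, -initrange, initrange)
# The fix: pass the weight tensor, not the decoder module itself.
nn.init.zeros_(decoder.weight)
nn.init.uniform_(decoder.weight, -initrange, initrange)

# All initialized weights now lie within [-initrange, initrange].
assert encoder.weight.abs().max().item() <= initrange
assert decoder.weight.abs().max().item() <= initrange
```

Note that in the patched code the zero initialization of `decoder.weight` is immediately overwritten by the uniform initialization on the next line, so the zeros call is effectively redundant as written; the surviving behavioral change is simply that `init_weights` no longer misuses the module object.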
