
Learning rate #45

Open
Zawan-uts opened this issue Mar 26, 2019 · 4 comments

Comments

@Zawan-uts

Where can we change the learning rate of nadam optimizer? Which optimizer is chosen if we don't specify in the params constructor?

@nreimers
Member

By default, nadam is chosen.

You can change the learning rate updates in the following way:

model = BiLSTM(params)
model.setMappings(mappings, embeddings)
model.setDataset(datasets, data)
model.learning_rate_updates = {'nadam': {1:0.01, 5:0.002}}

If nadam is chosen as the optimizer, it will start with a learning rate of 0.01. In epoch 5, the learning rate is updated to 0.002.

If you choose a different optimizer, for example adam, replace 'nadam' with 'adam'.
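To make the semantics of that schedule dict concrete, here is a small illustrative sketch (not the actual library code) of how a per-epoch learning rate could be looked up from such a dict. The function name `lr_for_epoch` and its signature are hypothetical:

```python
# Hypothetical helper: resolve the learning rate in effect at a given
# epoch from a schedule dict like the one shown above. The rate set at
# the latest epoch <= the current epoch applies.
learning_rate_updates = {'nadam': {1: 0.01, 5: 0.002}}

def lr_for_epoch(schedule, optimizer, epoch, default=None):
    """Return the learning rate in effect at `epoch`, or `default`
    (e.g. the framework's built-in rate) if no schedule exists."""
    updates = schedule.get(optimizer)
    if not updates:
        return default
    applicable = [e for e in updates if e <= epoch]
    if not applicable:
        return default
    return updates[max(applicable)]

print(lr_for_epoch(learning_rate_updates, 'nadam', 1))  # 0.01
print(lr_for_epoch(learning_rate_updates, 'nadam', 4))  # 0.01 (still)
print(lr_for_epoch(learning_rate_updates, 'nadam', 7))  # 0.002
```

With this reading, the schedule `{1: 0.01, 5: 0.002}` means epochs 1-4 train at 0.01 and epoch 5 onwards at 0.002.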

@Zawan-uts
Author

But in BiLSTM.py, I have this at line 61:
self.learning_rate_updates = {'sgd': {1: 0.1, 3: 0.05, 5: 0.01}}

Does that mean 'sgd' has been chosen for now?

@nreimers
Member

No.
In the dict you can specify the learning rate updates for different optimizers, e.g.

self.learning_rate_updates = {'sgd': {1: 0.1, 3: 0.05, 5: 0.01}, 'nadam': {1:0.01, 5:0.002}}

If SGD is used, the updates from self.learning_rate_updates['sgd'] are applied. If nadam is used, self.learning_rate_updates['nadam'] is applied. If self.learning_rate_updates[optimizer] is not specified, the default learning rate from Keras is used.
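The fall-through described above can be sketched as follows; this is an illustrative snippet, not the library's actual code, and the helper name `updates_for` is hypothetical:

```python
# Per-optimizer schedules; optimizers without an entry fall through to
# the framework default (an empty dict here means "no scheduled updates").
learning_rate_updates = {
    'sgd':   {1: 0.1, 3: 0.05, 5: 0.01},
    'nadam': {1: 0.01, 5: 0.002},
}

def updates_for(optimizer):
    """Return the schedule for `optimizer`, or an empty dict, meaning
    Keras' default learning rate is kept for the whole training run."""
    return learning_rate_updates.get(optimizer, {})

print(updates_for('sgd'))   # {1: 0.1, 3: 0.05, 5: 0.01}
print(updates_for('adam'))  # {} -> Keras default is used
```

So the presence of an 'sgd' entry in the dict does not mean SGD is selected; it only says what SGD's schedule would be if it were.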

@Zawan-uts
Author

OK thanks.
