
About initialization in the LSTM example #76

Open
danliu2 opened this issue Oct 11, 2016 · 0 comments

Comments


danliu2 commented Oct 11, 2016

Hi,
In the LSTM example, every worker initializes its own parameters with a different random seed. This seems to be a bug for the all-reduce update: the elastic force for each worker is then computed against a different center. It may be less serious in param-sync mode, since there all workers share a single central copy of the parameters in memory.
