hi,
In the LSTM example, every worker initializes its own parameters with a different random seed. This looks like a bug for the all-reduce update, because the elastic force on each worker is then computed against a different center. It may be less serious in param-sync mode, where all workers share a single central copy of the parameters.
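A minimal sketch of one possible fix (not the project's actual code): broadcast worker 0's initial parameters so every worker starts from the same point, giving the elastic force a common center. The use of mpi4py/numpy and the parameter shapes here are assumptions purely for illustration.

```python
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

def init_params(shapes, seed):
    """Randomly initialize one array per parameter shape."""
    rng = np.random.RandomState(seed)
    return [rng.uniform(-0.1, 0.1, size=s).astype(np.float32) for s in shapes]

shapes = [(256, 512), (512,)]          # hypothetical LSTM weight/bias shapes

# Seeding with the rank reproduces the reported problem: each worker
# starts from a different point, so each sees a different center.
params = init_params(shapes, seed=rank)

# Broadcasting worker 0's copy makes all workers start identically,
# so the elastic force is measured against the same center everywhere.
params = comm.bcast(params, root=0)
```

Alternatively, simply using the same fixed seed on every worker would have the same effect without any communication.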