-
An optimizer wrapper has been added which, in theory, should work in a distributed setting. With the current implementation, however, it does not seem to. Any ideas about why it does not work would be of great interest. If people are interested, a somewhat dated Kaggle notebook is available here.
-
See issue #5 for more details on the problem and what has been tried so far.
-
Multi-GPU training is now supported; fixed in 7392bc0. The optimizer wrapper handles it seamlessly, whereas the model wrapper has experimental support for the SGD optimizer only. A rough usage sketch follows below.
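For concreteness, here is a minimal, hedged sketch of how a purely optimizer-level wrapper can be used under PyTorch's DistributedDataParallel. The `WrappedOptimizer` class below is a pass-through stand-in, not the repository's actual API; the real class name, import path, and update rule come from the repository itself.

```python
# Hedged sketch, not the repository's actual API: a pass-through stand-in
# for the optimizer wrapper, run under PyTorch DistributedDataParallel.
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP


class WrappedOptimizer:
    """Placeholder for the repository's optimizer wrapper (name is assumed)."""

    def __init__(self, base_optimizer):
        self.base_optimizer = base_optimizer

    def zero_grad(self):
        self.base_optimizer.zero_grad()

    def step(self):
        # The real wrapper would apply its modified update rule here;
        # this stand-in simply delegates to the wrapped optimizer.
        self.base_optimizer.step()


def train(rank, world_size, model, loader):
    # One process per GPU. DDP averages gradients across ranks during
    # backward(), so a purely optimizer-level wrapper sees the already
    # synchronized gradients and needs no model-side changes.
    dist.init_process_group("nccl", rank=rank, world_size=world_size)
    ddp_model = DDP(model.to(rank), device_ids=[rank])

    base_opt = torch.optim.SGD(ddp_model.parameters(), lr=0.1)
    opt = WrappedOptimizer(base_opt)

    for x, y in loader:
        x, y = x.to(rank), y.to(rank)
        loss = torch.nn.functional.cross_entropy(ddp_model(x), y)
        opt.zero_grad()
        loss.backward()
        opt.step()

    dist.destroy_process_group()
```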