BEMB v0.1.6
What's Changed
Users can now specify the optimizer they want via the `model_optimizer` argument when initializing the model, as follows:
```python
bemb = LitBEMBFlex(..., model_optimizer="Adam", ...)
bemb = LitBEMBFlex(..., model_optimizer="LBFGS", ...)
```
The specified optimizer needs to be one of the optimizer classes in `torch.optim`.
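Because the optimizer is referenced by name, you can check that a given string is valid before training. A minimal sketch (illustrative only, not part of the BEMB API; the name resolution shown is one plausible way such a string maps to an optimizer class):

```python
import torch

# The optimizer name must match a class defined in torch.optim,
# e.g. "Adam", "LBFGS", or "SGD".
optimizer_name = "Adam"
assert hasattr(torch.optim, optimizer_name), f"{optimizer_name} is not in torch.optim"
optimizer_cls = getattr(torch.optim, optimizer_name)
print(optimizer_cls)  # <class 'torch.optim.adam.Adam'>
```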
We have developed a cleaner model estimation pipeline backed by PyTorch-Lightning:
```python
from bemb import run

run(
    bemb,
    dataset_train=dataset_train,
    dataset_val=dataset_val,
    dataset_test=dataset_test,
    batch_size=len(dataset_train) // 20,
    num_epochs=1000,
    device="cuda",
)
```
Import `run` directly from the `bemb` package to use it.
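For context, `run` wraps a standard PyTorch-Lightning training loop. A rough manual equivalent is sketched below; this is illustrative only (the actual internals of `run` may differ), and it assumes `LitBEMBFlex` behaves as a standard `LightningModule` and that the datasets are compatible with a plain `DataLoader`:

```python
import pytorch_lightning as pl
from torch.utils.data import DataLoader

# Illustrative sketch of what run() does, not its actual implementation.
# Assumes LitBEMBFlex is a standard pytorch_lightning.LightningModule
# and that the datasets can be wrapped in a vanilla DataLoader.
trainer = pl.Trainer(max_epochs=1000, accelerator="gpu", devices=1)
trainer.fit(
    bemb,
    train_dataloaders=DataLoader(dataset_train, batch_size=len(dataset_train) // 20),
    val_dataloaders=DataLoader(dataset_val, batch_size=len(dataset_val)),
)
trainer.test(bemb, dataloaders=DataLoader(dataset_test, batch_size=len(dataset_test)))
```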
- fix typo. by @TianyuDu in #20
- commits for supermarket and deterministic vi by @kanodiaayush in #23
- 30 add lbfgs support and cleaner pytorch lightning training loops by @TianyuDu in #32
New Contributors
- @kanodiaayush made their first contribution in #23
Full Changelog: v0.1.5...v0.1.6