
BEMB v0.1.6

@TianyuDu released this 20 May 21:16

What's Changed

Users can now specify the optimizer via the model_optimizer argument when initializing the model, as follows:

```python
bemb = LitBEMBFlex(..., model_optimizer="Adam", ...)
bemb = LitBEMBFlex(..., model_optimizer="LBFGS", ...)
```

The specified optimizer must be the name of an optimizer class in torch.optim.
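Because the argument is resolved by name, any string that names an optimizer class exported by torch.optim will work. A minimal sketch of that lookup (the resolve_optimizer helper below is illustrative, not part of the BEMB API):

```python
import torch

def resolve_optimizer(name: str) -> type:
    """Illustrative helper: map an optimizer name to its torch.optim class."""
    # getattr raises AttributeError for names torch.optim does not export,
    # which is exactly the "must be in torch.optim" constraint.
    return getattr(torch.optim, name)

assert resolve_optimizer("Adam") is torch.optim.Adam
assert resolve_optimizer("LBFGS") is torch.optim.LBFGS
```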

We have developed a cleaner model estimation pipeline backed by PyTorch Lightning:

```python
from bemb import run

run(
    bemb,
    dataset_train=dataset_train,
    dataset_val=dataset_val,
    dataset_test=dataset_test,
    batch_size=len(dataset_train) // 20,
    num_epochs=1000,
    device="cuda",
)
```

Import run directly from the bemb package to use it.
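For reference, run is a convenience wrapper around a standard PyTorch Lightning training loop. A minimal sketch of the roughly equivalent manual setup, assuming the bemb model and dataset_train from above (the plain DataLoader here is illustrative; BEMB provides its own data-loading utilities):

```python
import pytorch_lightning as pl
from torch.utils.data import DataLoader

# Illustrative only: run() roughly corresponds to fitting the LitBEMBFlex
# module with a standard Lightning Trainer.
trainer = pl.Trainer(max_epochs=1000, accelerator="gpu", devices=1)
train_loader = DataLoader(dataset_train, batch_size=len(dataset_train) // 20)
trainer.fit(bemb, train_dataloaders=train_loader)
```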

New Contributors

Full Changelog: v0.1.5...v0.1.6