Releases · pronobis/libspn-keras
v0.6.0
What’s Changed
Sampling Backprop Op
- Add a `SumOpSampleBackprop` so that sampling is done easily.
- Add a `zero_evidence_inference` method to `SequentialSumProductNetwork` that can be used to infer values for all variables. Combined with `SumOpSampleBackprop` this results in samples from the network.
- Add a notebook with an example of how to sample from an SPN (see the sketch after this list).
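A minimal sampling sketch based on the items above. The layer stack, the `set_default_sum_op` call, and the `size` argument to `zero_evidence_inference` are illustrative assumptions, not the exact API; the release's sampling notebook is the authoritative example.

```python
import libspn_keras as spnk

# Route backprop through sum layers via sampling (op name from this release).
spnk.set_default_sum_op(spnk.SumOpSampleBackprop())

# Hypothetical small SPN; the exact layer stack is an assumption here.
spn = spnk.models.SequentialSumProductNetwork([
    spnk.layers.NormalLeaf(num_components=4),
    spnk.layers.DenseSum(num_sums=8),
    spnk.layers.RootSum(return_weighted_child_logits=False),
])

# With all evidence removed, inference combined with SumOpSampleBackprop
# yields samples from the network. The `size` argument is hypothetical.
samples = spn.zero_evidence_inference(size=16)
```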
Simplify PoonDomingosMeanOfQuantileSplit
- Remove support for normalization within the initializer; data is expected to be normalized beforehand.
- Accept a `tf.data.Dataset` to allow for more consistency with class and method signatures of other library components.
Add PoonDomingosStddevOfQuantileSplit
- Add an initializer for leaf scales based on the same logic that `PoonDomingosMeanOfQuantileSplit` uses for leaf locations.
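A sketch of how the two quantile-split initializers might be wired into a leaf layer. It assumes both classes live under `spnk.initializers` and that `NormalLeaf` accepts `location_initializer`/`scale_initializer`; the data shape, batch size, and standardization step are purely illustrative.

```python
import numpy as np
import tensorflow as tf
import libspn_keras as spnk

raw = np.random.rand(1000, 16).astype(np.float32)
# Normalization is no longer done inside the initializer, so standardize first.
normalized = (raw - raw.mean(axis=0)) / raw.std(axis=0)
data = tf.data.Dataset.from_tensor_slices(normalized).batch(64)

leaf = spnk.layers.NormalLeaf(
    num_components=4,
    location_initializer=spnk.initializers.PoonDomingosMeanOfQuantileSplit(data),
    scale_initializer=spnk.initializers.PoonDomingosStddevOfQuantileSplit(data),
)
```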
Other
- Fixes in docs
- Loosen dependency constraints so that installing `libspn-keras` is less likely to result in conflicts
v0.5.2
What’s Changed
* Fix issues with log conv (#27) @jostosh
v0.5.1
What’s Changed
- Fixes `Conv2DSum` forward pass
- Adds `Clip` constraint
- Adds constraint parameter to `NormalLeaf` layer
- Adds option to normalize with cross-sample statistics for `NormalizeStandardScore` layer (see the sketch after this list)
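A sketch of the new constraint and normalization options. The module path `spnk.constraints.Clip`, the `scale_constraint` parameter name on `NormalLeaf`, and the `Clip` argument order are assumptions for illustration.

```python
import libspn_keras as spnk

# Keep leaf scales within a range via the new Clip constraint; the parameter
# name on NormalLeaf and the Clip argument order are assumed here.
leaf = spnk.layers.NormalLeaf(
    num_components=4,
    scale_constraint=spnk.constraints.Clip(1e-2, 10.0),
)

# Per this release, NormalizeStandardScore can also normalize with
# cross-sample statistics; the constructor flag for that is not shown here.
normalize = spnk.layers.NormalizeStandardScore()
```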
v0.5.0
What’s Changed
- Add normalized greater equal epsilon constraint
- Simplify API for building random SPNs. No need to determine a `factors` array for `PermuteAndPadScopesRandom` anymore when using `libspn_keras.models.SequentialSumProductNetwork` (see the sketch after this list).
- Normalize EM updates and use 'gliding averages'
- Add notebook for benchmarking against Einsum Networks
- Add Dirichlet initializer for accumulators
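A sketch of the simplified random-SPN construction and the Dirichlet accumulator initializer. It assumes `PermuteAndPadScopesRandom` can now be added without a `factors` argument, that the initializer is exposed as `spnk.initializers.Dirichlet`, and that it is passed via the sum layer's `logits_initializer` parameter; the full layer stack is illustrative.

```python
import libspn_keras as spnk

spn = spnk.models.SequentialSumProductNetwork([
    spnk.layers.FlatToRegions(num_decomps=1),
    spnk.layers.NormalLeaf(num_components=4),
    # No factors array needed anymore; the model determines the scopes itself.
    spnk.layers.PermuteAndPadScopesRandom(),
    spnk.layers.DenseProduct(num_factors=2),
    # Dirichlet initializer for the sum accumulators (parameter name assumed).
    spnk.layers.DenseSum(num_sums=4, logits_initializer=spnk.initializers.Dirichlet()),
    spnk.layers.RootSum(return_weighted_child_logits=True),
])
```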