Static Data Types and More
With this release we have finally added support for static data type
information for tensors (not yet for symbolic tensors -- for now, this
effectively gives us a statically-typed version of NumPy for Scala).
This is an important milestone that contributes significantly to type
safety, helping catch errors at compile time rather than at runtime.
For example:
```scala
val t1 = Tensor(0.5, 1)  // The inferred type is Tensor[FLOAT64].
val t2 = Tensor(1, 2)    // The inferred type is Tensor[INT32].
val t3 = t1 + t2         // The inferred type is Tensor[FLOAT64].
val t4 = t3.isNaN        // The inferred type is Tensor[BOOLEAN].
val t5 = t3.any()        // Fails at compile-time because `any()` is only
                         // supported for Tensor[BOOLEAN].
```
Other new features include:
- Improvements to the high-level learn API (see the layer sketch after this list):
  - Layers can now provide and use their own parameter generator, and can
    also access the current training step (using `Layer.currentStep`).
  - Layers now support `.map(...)`.
  - Added support for batch normalization.
- Added support for `tf.logSigmoid` and `tf.lrn`.
- Added support for the following new metrics:
  - Grouped precision.
  - Precision-at-k.
- `data` module:
  - Added support for loading the extreme classification repository
    datasets (i.e., `data.XCLoader`).
  - Added support for randomly splitting datasets (see the sketch after this list).
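
To make the learn API items above more concrete, here is a minimal sketch of
composing layers and mapping over a layer's output. The layer names,
constructor arguments, and the exact `.map(...)` signature are assumptions
used for illustration; see the API documentation for the released interface.

```scala
import org.platanios.tensorflow.api._

// Hypothetical model assembled from learn-API layers. The layer names and
// constructor arguments below are illustrative assumptions, not a verbatim
// copy of the released API.
val model =
  tf.learn.Flatten("Input/Flatten") >>
  tf.learn.Cast("Input/Cast", FLOAT32) >>
  tf.learn.Linear("Layer0/Linear", 128) >>
  tf.learn.ReLU("Layer0/ReLU") >>
  tf.learn.Linear("Output/Linear", 10)

// Assumed shape of the newly added `.map(...)`: post-process a layer's
// output, here with the also newly added `tf.logSigmoid` op.
val mappedModel = model.map(output => tf.logSigmoid(output))
```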
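
For the `data` module additions, the following hypothetical sketch shows the
intent of random dataset splitting. `TensorSlicesDataset`, `randomSplit`, and
their arguments are assumptions used only to convey the idea, not the actual
interface.

```scala
import org.platanios.tensorflow.api._

// Hypothetical example: build a small in-memory dataset and split it randomly
// into training and test portions. `TensorSlicesDataset` and `randomSplit`
// (including its argument names) are assumed here for illustration and may
// differ from the actual `data` module interface.
val features = Tensor(Tensor(0.0, 1.0), Tensor(2.0, 3.0), Tensor(4.0, 5.0), Tensor(6.0, 7.0))
val dataset  = tf.data.TensorSlicesDataset(features)

// Assumed helper that partitions the dataset into two random, disjoint parts.
val Seq(trainData, testData) = dataset.randomSplit(weights = Seq(0.8, 0.2), seed = Some(42))
```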