New feature
The main feature of this patch release is that AccumBN can now be used as a drop-in replacement for any BatchNormalization layer, even in pretrained networks. Existing weights are transferred correctly, and the documentation has been updated to show how to do this.
```python
import tensorflow as tf
from gradient_accumulator import GradientAccumulateModel
from gradient_accumulator.layers import AccumBatchNormalization
from gradient_accumulator.utils import replace_batchnorm_layers

accum_steps = 4

# replace BN layers with AccumBatchNormalization
# (224x224x3 is the expected input shape for ImageNet-pretrained MobileNetV2)
model = tf.keras.applications.MobileNetV2(input_shape=(224, 224, 3))
model = replace_batchnorm_layers(model, accum_steps=accum_steps)

# add gradient accumulation to the existing model
model = GradientAccumulateModel(accum_steps=accum_steps, inputs=model.input, outputs=model.output)
```
What's Changed
- Docs: Support tf 2.2-2.12 by @andreped in #100
- Allow poorer approximation for older tf versions in model test by @andreped in #101
- Fixed typo in setup.cfg by @andreped in #104
- Ignore .pyc [no ci] by @andreped in #106
- Delete redundant .pyc file [no ci] by @andreped in #107
- Added Applications to README by @andreped in #109
- Fixed whl installation in test CI by @andreped in #110
- Added method to replace BN layers by @andreped in #112
Full Changelog: v0.5.1...v0.5.2