Releases: kundajelab/deeplift
Added support for tanh activations and intermediate-layer sigmoid activations
Corresponds to PR #117 (sigmoid as the output layer was already supported)
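For illustration, a minimal conversion sketch following the pattern in the repository README (the file name is a placeholder; the mode choice is the README's genomics default):

```python
import deeplift
from deeplift.conversion import kerasapi_conversion as kc

# Convert a saved Keras model whose hidden layers use tanh (or sigmoid)
# activations; "model_with_tanh.h5" is an illustrative path.
deeplift_model = kc.convert_model_from_saved_files(
    "model_with_tanh.h5",
    nonlinear_mxts_mode=deeplift.layers.NonlinearMxtsMode.DeepLIFT_GenomicsDefault)
```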
Faster dinucleotide shuffling
Thanks to @atseng95 for PR #109, which greatly speeds up dinucleotide shuffling. This tag also incorporates the change from PR #105 (support for supplying pre-generated shuffled references, previously tagged as v0.6.11.0) and the small fix in PR #101 (which lets the user recover after accidentally setting an invalid task index, previously tagged as v0.6.10.1).
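Roughly, the fast approach recasts the shuffle as an Eulerian walk over per-character successor lists. A minimal NumPy sketch of that idea (not the library's exact code):

```python
import numpy as np

def dinuc_shuffle_sketch(seq, rng=None):
    """Shuffle `seq` while preserving its exact dinucleotide counts.

    Sketch of the Eulerian-walk trick: shuffle, for each character, the
    list of characters that follow it, keeping each list's final entry
    fixed so a complete walk ending at the original last character is
    guaranteed to exist.
    """
    if rng is None:
        rng = np.random.RandomState()
    chars, tokens = np.unique(list(seq), return_inverse=True)

    # For each token, the indices (into `tokens`) of its successors
    next_inds = [np.where(tokens[:-1] == t)[0] + 1 for t in range(len(chars))]

    # Shuffle each successor list, holding the last entry in place
    for t in range(len(chars)):
        inds = np.arange(len(next_inds[t]))
        if len(inds) > 1:
            inds[:-1] = rng.permutation(len(inds) - 1)
        next_inds[t] = next_inds[t][inds]

    # Walk the shuffled lists, consuming one successor per step
    counters = [0] * len(chars)
    ind = 0
    result = [tokens[0]]
    for _ in range(len(tokens) - 1):
        t = tokens[ind]
        ind = next_inds[t][counters[t]]
        counters[t] += 1
        result.append(tokens[ind])
    return "".join(chars[result])
```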
Fix to avoid redundant resetting of mxts
Fix for models that don't have biases; target layer index error now a warning
Corresponds to PR #93, which implements two changes: (1) a fix for loading models that don't have biases, and (2) the message about the target layer index, which would previously be thrown as a runtime error, now just results in a warning being printed, as there are legitimate situations where that edge case can occur. See #92 for the issue that prompted the changes.
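As a sketch of the first fix, a Keras model built entirely with `use_bias=False` should now convert (the file name is a placeholder, and the conversion call follows the README pattern):

```python
import keras
from deeplift.conversion import kerasapi_conversion as kc

# Layers built with use_bias=False have no bias weights on disk,
# which previously tripped up model loading.
model = keras.models.Sequential([
    keras.layers.Dense(16, activation="relu", use_bias=False, input_shape=(10,)),
    keras.layers.Dense(1, activation="sigmoid", use_bias=False),
])
model.save("no_bias_model.h5")  # illustrative path

# With the fix, conversion succeeds despite the missing biases
deeplift_model = kc.convert_model_from_saved_files("no_bias_model.h5")
```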
Ability to reuse same shuffled references for multiple tasks
This feature was requested in #83 and implemented in PR https://github.com/kundajelab/deeplift/pull/84/files. I forgot to merge it into the master branch at the time and am doing so now. The genomics notebook was updated to use this feature (and also ported to Python 3) in #94.
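A sketch of the pattern (the scoring-function call follows the repository README; the batch size is a placeholder):

```python
import numpy as np

def scores_for_all_tasks(contribs_func, onehot_data, references, num_tasks):
    """Reuse one set of shuffled references across every task.

    `contribs_func` is the callable returned by
    deeplift_model.get_target_contribs_func(find_scores_layer_idx=...,
    target_layer_idx=...); `references` are pre-generated shuffled
    sequences, one-hot encoded to match `onehot_data`.
    """
    return [
        np.array(contribs_func(task_idx=task_idx,
                               input_data_list=[onehot_data],
                               # same references passed for every task
                               input_references_list=[references],
                               batch_size=200,
                               progress_update=None))
        for task_idx in range(num_tasks)
    ]
```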
Added ability to provide a random state for dinucleotide shuffling
(so that the random state doesn't have to be controlled by setting the numpy random seed externally, which doesn't always play well with jupyter notebooks)
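A short sketch (the `rng` argument name matches the current `dinuc_shuffle` signature, but treat it as an assumption):

```python
import numpy as np
from deeplift.dinuc_shuffle import dinuc_shuffle

# An explicit RandomState makes shuffles reproducible without mutating
# numpy's global seed, which is easy to clobber between notebook cells.
rng = np.random.RandomState(1234)
shuffled = dinuc_shuffle("ACGTACGTGGCCTTAA", rng=rng)
```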
Support for dinucleotide shuffling directly on one-hot encoded sequences
Corresponds to the feature added by @annashcherbina in #78
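For example (the A/C/G/T column order is an assumption about the encoding):

```python
import numpy as np
from deeplift.dinuc_shuffle import dinuc_shuffle

# A one-hot track of shape (length, 4); here the rows encode "ACGTAG"
one_hot = np.eye(4)[[0, 1, 2, 3, 0, 2]]
shuffled = dinuc_shuffle(one_hot)  # returns a shuffled one-hot array
assert shuffled.shape == one_hot.shape
```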
Coping with Keras 2.2.3 breaking change + GlobalAveragePooling layer
Python 3 fix in dinuc shuffle
Corresponds to PR #62
Upgraded to work with the latest TensorFlow
(TensorFlow 1.10.1). Also updated some of the tests to work with Keras 2.2.