
New fork of encoding information since my old one (EncodingInformationMine) got fried

lakabuli/EncodingInformation

 
 


Code and experiments from the paper Information-driven design of imaging systems. For detailed usage, see the documentation.

Installation guide

pip install encoding_information

Additional setup may be required to get compatible versions of JAX/Flax; see:

https://github.com/Waller-Lab/EncodingInformation/blob/main/Installation_guide.md
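For example, on a Linux machine with CUDA 12, a typical sequence might look like the following. The version pins in the linked guide take precedence; the `jax[cuda12]` extra is from JAX's own install documentation, not specific to this repo.

```shell
pip install encoding_information
# Optional GPU support: install a matching JAX build (see JAX's install docs)
pip install -U "jax[cuda12]"
```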

Quick Start

from encoding_information.models import PixelCNN, PoissonNoiseModel
from encoding_information import estimate_information, extract_patches

# Load measurement data (N x H x W numpy array of images) 
measurements = load_measurements()  

# Split into training/test sets and extract patches
# Breaking large images into patches increases computational efficiency
# Test set is used to evaluate information estimates
patches = extract_patches(measurements[:-200], patch_size=16)
test_patches = extract_patches(measurements[-200:], patch_size=16) 

# Initialize and fit model to training data
model = PixelCNN()  # Also supports FullGaussianProcess, StationaryGaussianProcess
noise_model = PoissonNoiseModel()
model.fit(patches)

# Estimate information content with confidence bounds
# Error bars are calculated based on test set size
info, lower_bound, upper_bound = estimate_information(
    model,
    noise_model,
    patches,
    test_patches,
    confidence_interval=0.95
)

print(f"Information: {info:.2f} ± {(upper_bound-lower_bound)/2:.2f} bits/pixel")
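Since `load_measurements()` above is a placeholder, here is a minimal pure-NumPy sketch of the data preparation step. `extract_patches_sketch` is a hypothetical stand-in for the library's `extract_patches` (whose actual sampling strategy may differ, e.g. random patch locations); it simply tiles each image into non-overlapping patches:

```python
import numpy as np

def extract_patches_sketch(images, patch_size=16):
    """Tile each (H, W) image into non-overlapping patch_size x patch_size patches.

    Illustrative only; the library's extract_patches may sample differently.
    """
    n, h, w = images.shape
    # Crop so both dimensions divide evenly by patch_size
    h_crop, w_crop = h - h % patch_size, w - w % patch_size
    cropped = images[:, :h_crop, :w_crop]
    # Reshape to (N, H/p, p, W/p, p), then reorder so each patch is contiguous
    tiles = cropped.reshape(n, h_crop // patch_size, patch_size,
                            w_crop // patch_size, patch_size)
    return tiles.transpose(0, 1, 3, 2, 4).reshape(-1, patch_size, patch_size)

# Stand-in for real measurements: Poisson-distributed photon counts
measurements = np.random.poisson(50.0, size=(10, 64, 64)).astype(np.float32)
patches = extract_patches_sketch(measurements, patch_size=16)
# 10 images, each tiled into 4 x 4 patches of 16 x 16 -> 160 patches total
```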

We provide three models with different tradeoffs:

  • PixelCNN: most accurate estimates, but slowest
  • FullGaussianProcess: fastest
  • StationaryGaussianProcess: intermediate speed; best performance on limited data

For highest accuracy, train multiple models and select the one that gives the lowest information estimate: each model yields an upper bound on the true information content, so the lowest estimate is the tightest bound.
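The selection step above amounts to taking the minimum over per-model estimates. The values below are purely illustrative, not real outputs:

```python
# Hypothetical information estimates (bits/pixel) from fitting each model.
# Each is an upper bound on the true information, so the smallest is tightest.
estimates = {
    "PixelCNN": 2.31,
    "FullGaussianProcess": 2.58,
    "StationaryGaussianProcess": 2.44,
}
best_model = min(estimates, key=estimates.get)
tightest_bound = estimates[best_model]
```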

Documentation

https://encodinginformation.readthedocs.io/en/latest/

Contributing

  1. Fork the repository and clone your fork
  2. git remote add upstream https://github.com/Waller-Lab/EncodingInformation.git
  3. git config pull.rebase false
  4. Contribute via pull requests
  5. Run git pull upstream main to pull the latest updates from the main repo
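Assuming <your-username> is your GitHub account (a placeholder, not a real path), the setup steps above amount to:

```shell
git clone https://github.com/<your-username>/EncodingInformation.git
cd EncodingInformation
git remote add upstream https://github.com/Waller-Lab/EncodingInformation.git
git config pull.rebase false
# Later, to sync with the main repo:
git pull upstream main
```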
