Helper Module for Deep Learning with PyTorch.
This work is made available by a community of contributors, among them the CEA Neurospin BAOBAB laboratory.
You can list all available Deep Learning tools by executing in a Python shell:
from pprint import pprint
import pynet
pprint(pynet.get_tools())
The 'get_tools' function returns a dictionary with all available 'networks', 'losses', 'regularizers', and 'metrics'.
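For instance, a minimal sketch of inspecting the returned dictionary might look like this (the keys follow the description above; the actual contents depend on your installation):

from pprint import pprint
import pynet

tools = pynet.get_tools()
# The top-level keys group the tools by kind.
pprint(sorted(tools.keys()))  # ['losses', 'metrics', 'networks', 'regularizers']
# Display only the registered networks.
pprint(tools["networks"])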
Each network is embedded in a Deep Learning training interface providing a 'training' and a 'testing' method. Network parameters are set using the NetParameters object. You can list all these interfaces by executing in a Python shell:
from pprint import pprint
import pynet
pprint(pynet.get_interfaces(family=None))
params = pynet.NetParameters(param1=1, param2=2)
params.param3 = 3
The 'get_interfaces' function returns a dictionary with interfaces sorted by family names. You can filter the result by providing the family name or a list of family names of interest.
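As an illustrative sketch, filtering might look like the following; the family names used here ('classification', 'segmentation') are only examples and may not be declared in your installation:

from pprint import pprint
import pynet

# Keep a single family of interfaces (hypothetical family name).
pprint(pynet.get_interfaces(family="classification"))
# Keep several families at once (hypothetical family names).
pprint(pynet.get_interfaces(family=["classification", "segmentation"]))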
You can also list all available data fetchers by executing in a Python shell:
from pprint import pprint
from pynet.datasets import get_fetchers
pprint(get_fetchers())
The 'get_fetchers' function returns a dictionary with all the declared fetchers. Finally, you may want to look at the data manager class that provides convenient tools to split/stratify your dataset:
from pynet.datasets import DataManager
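As an illustration only, a split/stratify call might resemble the sketch below; every parameter name used here (input_path, metadata_path, stratify_label, number_of_folds, batch_size) is an assumption drawn from the description above, not a verified signature, so check the DataManager documentation before use:

from pynet.datasets import DataManager

# Hypothetical sketch: the keyword arguments below are assumptions, not the
# verified DataManager signature.
manager = DataManager(
    input_path="/path/to/inputs.npy",       # placeholder input array
    metadata_path="/path/to/metadata.tsv",  # placeholder metadata table
    stratify_label="diagnosis",             # hypothetical stratification column
    number_of_folds=5,                      # hypothetical cross-validation folds
    batch_size=32)                          # hypothetical batch size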
Make sure you have installed all the package dependencies. Further instructions are available here.