When a model contains more than one `nn.Sequential` module and both reuse the same activation-function instance defined in `__init__`, torchinfo splits the single `nn.Sequential` into separate ones at each call of the shared activation.
For example:
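The original snippet was not captured in this report, so the following is a minimal sketch consistent with the description; the layer sizes and the name `firstNetwork` are assumptions, while `secondNetwork` and `self.actFun` come from the text:

```python
import torch.nn as nn
# from torchinfo import summary  # used below to reproduce the reported output

class TwoNetworks(nn.Module):
    def __init__(self):
        super().__init__()
        # One shared activation instance, reused in both Sequential modules
        self.actFun = nn.Tanh()
        self.firstNetwork = nn.Sequential(
            nn.Linear(10, 20), self.actFun,
            nn.Linear(20, 10), self.actFun,
        )
        self.secondNetwork = nn.Sequential(
            nn.Linear(10, 20), self.actFun,
            nn.Linear(20, 10), self.actFun,
        )

    def forward(self, x):
        return self.firstNetwork(x)

# summary(TwoNetworks(), input_size=(1, 10))  # the summary shows the split
```

Because `self.actFun` is a single module object, it appears under both `nn.Sequential` containers in the module tree, which is what torchinfo trips over when printing the summary.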
results in the following summary:
Even though `secondNetwork` is unused, changing one of the `self.actFun` calls to e.g. `nn.LeakyReLU` fixes the problem. Is this a torchinfo problem, or am I maybe doing something wrong here?
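Consistent with that observation, giving each `nn.Sequential` its own activation instance (rather than reusing one object) avoids the shared-module situation entirely. A sketch of that variant, with the same assumed layer sizes as above:

```python
import torch.nn as nn

class TwoNetworksFixed(nn.Module):
    def __init__(self):
        super().__init__()
        # Fresh nn.Tanh() per position: no module object is shared
        # between the two Sequential containers.
        self.firstNetwork = nn.Sequential(
            nn.Linear(10, 20), nn.Tanh(),
            nn.Linear(20, 10), nn.Tanh(),
        )
        self.secondNetwork = nn.Sequential(
            nn.Linear(10, 20), nn.Tanh(),
            nn.Linear(20, 10), nn.Tanh(),
        )

    def forward(self, x):
        return self.firstNetwork(x)
```

Activation modules like `nn.Tanh` are stateless, so creating one instance per use site costs nothing and keeps the module tree a proper tree, which summary tools generally assume.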