What happened?
The issue
The current `_SingleSymPyModule.forward(self, X)` method flattens input column vectors (dim=2). If we have a column vector `x` with shape `(L, 1)` (dim=2) as input, then the expected behavior is for the output `y` of a module to also have the same shape (and dim). Currently, `y` has shape `(L,)`.
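A minimal sketch of this shape change, using a hypothetical stand-in for `_SingleSymPyModule.forward` (not the actual PySR code):

```python
import torch

# Hypothetical, simplified stand-in for _SingleSymPyModule.forward:
# selecting a feature column with X[:, i] drops the second dimension.
def forward_flattening(X: torch.Tensor) -> torch.Tensor:
    return X[:, 0]  # shape (L,); the column dimension is lost

L = 5
x = torch.randn(L, 1)        # a column vector, dim=2
y = forward_flattening(x)

print(x.shape)  # torch.Size([5, 1])
print(y.shape)  # torch.Size([5]) -- expected torch.Size([5, 1])
```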
Besides being a matter of consistency, the flattening may not technically matter for single vectors, but as soon as one has downstream modules that expect a certain shape, there's a problem. Indeed, I found the issue with a multivariate `_SingleSymPyModule`, which takes its inputs from 2 other such modules. As the code expects a matrix of column vectors as feature inputs, an error is thrown, since the second dimension no longer exists due to the flattening in upstream `_SingleSymPyModule`s. But this is of course not limited to compositions of this type of module.

Specifically, on line 184 of the `export_torch` module, the extracted `X[:, i]` has dimension 1. This means that `_SingleSymPyModule` itself expects column vector features, but because of upstream flattening of `X` by another `_SingleSymPyModule`, we get the error below:

Error
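The failure can be reproduced in isolation with a minimal sketch (the two-stage indexing below is an assumed simplification of the module composition, not the actual PySR code):

```python
import torch

# An upstream module returns a flattened (L,) tensor instead of (L, 1);
# the downstream forward then indexes it as a 2-D matrix with X[:, i].
L = 5
upstream_out = torch.randn(L, 1)[:, 0]   # flattened to shape (L,)

try:
    downstream_feature = upstream_out[:, 0]  # the X[:, i] step
except IndexError as e:
    print(e)  # IndexError: too many indices for a 1-D tensor
```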
Suggestion for a fix

If one assumes inputs are always matrices with features as columns, then a simple fix is one that adds an extra dimension after the last shape index, so `(L,)` becomes `(L, 1)`. Then, the subsequent evaluation `self._node(symbols)` preserves the shape.

Alternatively, `torch.index_select` could be used, since its returned tensor "has the same number of dimensions as the original tensor," though this might be overkill given the above assumption. I'm not exactly fluent in PyTorch tensor manipulations, but the first solution seems adequate here.
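Both options can be sketched as follows (assumed illustrations of the two approaches, not the exact patch):

```python
import torch

# Option 1: restore the column dimension after flattening, (L,) -> (L, 1).
y_flat = torch.randn(5)
y_col = y_flat.unsqueeze(-1)      # equivalently: y_flat[:, None]
assert y_col.shape == (5, 1)

# Option 2: avoid flattening in the first place. torch.index_select keeps
# the number of dimensions of its input, so selecting column i of an
# (L, k) matrix yields (L, 1) instead of (L,).
X = torch.randn(5, 3)
i = 1
col = torch.index_select(X, dim=1, index=torch.tensor([i]))
assert col.shape == (5, 1)
```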
The existing `TestTorch` tests pass with both solutions.

Version
0.17.2
Operating System
Linux
Package Manager
pip
Interface
Other (specify below)
Relevant log output
No response
Extra Info
The issue does not depend on the interface.